Importing libraries¶

In [1]:
import numpy as np
import matplotlib.pyplot as plt
import random
import pandas as pd
import tensorflow as tf
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error
from sklearn.metrics import mean_absolute_error
from sklearn.preprocessing import StandardScaler
from sklearn.preprocessing import Normalizer
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import SGD, Adam
from keras.layers import Flatten,Dense,BatchNormalization,Activation,Dropout
from keras.callbacks import ReduceLROnPlateau
from sklearn.metrics import classification_report, confusion_matrix, ConfusionMatrixDisplay

Questão 1¶

Utilize redes neurais perceptron de múltiplas camadas para aproximar as funções abaixo. Apresente um gráfico com a curva da função analítica e a curva da função aproximada pela rede neural. Apresente também a curva do erro médio de treinamento com relação ao número de épocas e a curva do erro médio com o conjunto de validação. Procure definir para cada função a arquitetura da rede neural perceptron, isto é, o número de entradas, o número de neurônios em cada camada e o número de neurônios camada de saída.

Observações: Como se trata de um problema de aproximação de funções, considere a ca- mada de saída do tipo linear puro. Isto é, φ(v)=v, onde v é o potencial de ativação.

a)¶

$f(x_1,x_2) = (1 -x_1)^2 + 100 (x_2 - (x_1)^2)^2$ com $-10 ≤ x_1 ≤ 10, -10 ≤ x_2 ≤10$

In [ ]:
# function's definition
def f(x1,x2):
  return (1 - x1)**2 + 100*(x2 - (x1)**2)**2
In [ ]:
# generating points
x1, x2 = np.meshgrid(np.linspace(-10, 10, 100), np.linspace(-10, 10, 100))
y = f(x1,x2)
In [ ]:
fig, ax = plt.subplots(figsize=(10, 7), subplot_kw=dict(projection='3d'))

ax.plot_surface(x1, x2, y)

ax.set(
    xlabel='$x_1$',
    ylabel='$x_2$',
    zlabel='$f(x_1, x_2)$'
)

plt.tight_layout()
plt.show()
In [ ]:
# split training and test
x_train, x_test, y_train, y_test = train_test_split(
    np.vstack([x1.flatten(), x2.flatten()]).T, 
    y.flatten(), 
    test_size=0.2, 
    random_state=42
)
In [ ]:
fig, ax = plt.subplots(figsize=(10, 7), subplot_kw=dict(projection='3d'))

ax.plot_wireframe(x1, x2, y, linewidths=0.5, color='lightgrey')
ax.scatter(x_train[:,0], x_train[:,1], y_train, s=1, color='darkorange', label='Training data')
ax.scatter(x_test[:,0], x_test[:,1], y_test, s=5, color='darkgreen', label='Test data')

ax.set(
    xlabel='$x_1$',
    ylabel='$x_2$',
    zlabel='$f(x_1, x_2)$'
)

plt.legend()
plt.tight_layout()
plt.show()
In [ ]:
# created scaler
scaler = StandardScaler()

y_train = y_train.reshape(-1,1)
y_test = y_test.reshape(-1,1)

# fit normalizer on training dataset
scaler.fit(y_train)
 
# transform training dataset
y_train = scaler.transform(y_train)
 
# transform test dataset
y_test = scaler.transform(y_test)
In [ ]:
# verifying shapes
print("X train shape: ", x_train.shape)
print("Y train shape: ", y_train.shape)
print("X test shape: ", x_test.shape)
print("Y test shape: ", y_test.shape)
X train shape:  (8000, 2)
Y train shape:  (8000, 1)
X test shape:  (2000, 2)
Y test shape:  (2000, 1)
In [ ]:
mlp = Sequential([
    Dense(64, activation='relu', input_shape=(2,)),
    Dense(32, activation='relu'),
    Dense(16, activation='relu'),
    Dense(8, activation='relu'),
    Dense(4, activation='relu'),
    Dense(1, activation='linear')
])

mlp.compile(
    loss='mean_squared_error',
    optimizer='adam'
)

mlp.summary()
Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_6 (Dense)             (None, 64)                192       
                                                                 
 dense_7 (Dense)             (None, 32)                2080      
                                                                 
 dense_8 (Dense)             (None, 16)                528       
                                                                 
 dense_9 (Dense)             (None, 8)                 136       
                                                                 
 dense_10 (Dense)            (None, 4)                 36        
                                                                 
 dense_11 (Dense)            (None, 1)                 5         
                                                                 
=================================================================
Total params: 2,977
Trainable params: 2,977
Non-trainable params: 0
_________________________________________________________________
In [ ]:
history = mlp.fit(
    x_train, y_train,
    batch_size=8,
    epochs=2000,
    validation_split=0.1,
    callbacks=[
        tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10),
        tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=5, min_lr=0.0001)
    ]
)
Epoch 1/2000
900/900 [==============================] - 2s 1ms/step - loss: 0.2403 - val_loss: 0.0481 - lr: 0.0010
Epoch 2/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0301 - val_loss: 0.0225 - lr: 0.0010
Epoch 3/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0200 - val_loss: 0.0144 - lr: 0.0010
Epoch 4/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0144 - val_loss: 0.0141 - lr: 0.0010
Epoch 5/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0098 - val_loss: 0.0073 - lr: 0.0010
Epoch 6/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0094 - val_loss: 0.0056 - lr: 0.0010
Epoch 7/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0102 - val_loss: 0.0024 - lr: 0.0010
Epoch 8/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0082 - val_loss: 0.0071 - lr: 0.0010
Epoch 9/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0055 - val_loss: 0.0031 - lr: 0.0010
Epoch 10/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0053 - val_loss: 0.0126 - lr: 0.0010
Epoch 11/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0051 - val_loss: 8.4492e-04 - lr: 0.0010
Epoch 12/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0045 - val_loss: 0.0091 - lr: 0.0010
Epoch 13/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0103 - val_loss: 7.3677e-04 - lr: 0.0010
Epoch 14/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0041 - val_loss: 0.0061 - lr: 0.0010
Epoch 15/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0034 - val_loss: 0.0015 - lr: 0.0010
Epoch 16/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0047 - val_loss: 0.0060 - lr: 0.0010
Epoch 17/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0028 - val_loss: 0.0046 - lr: 0.0010
Epoch 18/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0045 - val_loss: 0.0064 - lr: 0.0010
Epoch 19/2000
900/900 [==============================] - 1s 1ms/step - loss: 3.8528e-04 - val_loss: 3.2380e-04 - lr: 1.0000e-04
Epoch 20/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.4806e-04 - val_loss: 2.2849e-04 - lr: 1.0000e-04
Epoch 21/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.4695e-04 - val_loss: 2.0553e-04 - lr: 1.0000e-04
Epoch 22/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.3648e-04 - val_loss: 1.9860e-04 - lr: 1.0000e-04
Epoch 23/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.3406e-04 - val_loss: 1.8407e-04 - lr: 1.0000e-04
Epoch 24/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.1895e-04 - val_loss: 2.1349e-04 - lr: 1.0000e-04
Epoch 25/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.5439e-04 - val_loss: 2.7080e-04 - lr: 1.0000e-04
Epoch 26/2000
900/900 [==============================] - 1s 1ms/step - loss: 2.3294e-04 - val_loss: 3.1382e-04 - lr: 1.0000e-04
Epoch 27/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.9288e-04 - val_loss: 2.5049e-04 - lr: 1.0000e-04
Epoch 28/2000
900/900 [==============================] - 2s 2ms/step - loss: 2.0941e-04 - val_loss: 1.4517e-04 - lr: 1.0000e-04
Epoch 29/2000
900/900 [==============================] - 2s 2ms/step - loss: 2.0231e-04 - val_loss: 1.7252e-04 - lr: 1.0000e-04
Epoch 30/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.8211e-04 - val_loss: 1.2884e-04 - lr: 1.0000e-04
Epoch 31/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.8255e-04 - val_loss: 1.1797e-04 - lr: 1.0000e-04
Epoch 32/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.5825e-04 - val_loss: 2.3869e-04 - lr: 1.0000e-04
Epoch 33/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.8320e-04 - val_loss: 1.7549e-04 - lr: 1.0000e-04
Epoch 34/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.5552e-04 - val_loss: 1.2200e-04 - lr: 1.0000e-04
Epoch 35/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.6122e-04 - val_loss: 1.1288e-04 - lr: 1.0000e-04
Epoch 36/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.6115e-04 - val_loss: 1.4661e-04 - lr: 1.0000e-04
Epoch 37/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.7000e-04 - val_loss: 1.8074e-04 - lr: 1.0000e-04
Epoch 38/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.5897e-04 - val_loss: 1.3702e-04 - lr: 1.0000e-04
Epoch 39/2000
900/900 [==============================] - 2s 2ms/step - loss: 1.8850e-04 - val_loss: 9.9578e-05 - lr: 1.0000e-04
Epoch 40/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.8244e-04 - val_loss: 1.0522e-04 - lr: 1.0000e-04
Epoch 41/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3757e-04 - val_loss: 1.5331e-04 - lr: 1.0000e-04
Epoch 42/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3224e-04 - val_loss: 2.4568e-04 - lr: 1.0000e-04
Epoch 43/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.6857e-04 - val_loss: 1.4375e-04 - lr: 1.0000e-04
Epoch 44/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3927e-04 - val_loss: 3.8073e-04 - lr: 1.0000e-04
Epoch 45/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.6187e-04 - val_loss: 4.2486e-04 - lr: 1.0000e-04
Epoch 46/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.4784e-04 - val_loss: 1.7491e-04 - lr: 1.0000e-04
Epoch 47/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.5885e-04 - val_loss: 1.2637e-04 - lr: 1.0000e-04
Epoch 48/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3876e-04 - val_loss: 9.0454e-05 - lr: 1.0000e-04
Epoch 49/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.7796e-04 - val_loss: 1.3724e-04 - lr: 1.0000e-04
Epoch 50/2000
900/900 [==============================] - 2s 2ms/step - loss: 1.3802e-04 - val_loss: 1.1495e-04 - lr: 1.0000e-04
Epoch 51/2000
900/900 [==============================] - 2s 2ms/step - loss: 1.4784e-04 - val_loss: 1.0661e-04 - lr: 1.0000e-04
Epoch 52/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2565e-04 - val_loss: 9.0812e-05 - lr: 1.0000e-04
Epoch 53/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.4147e-04 - val_loss: 1.0544e-04 - lr: 1.0000e-04
Epoch 54/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.4119e-04 - val_loss: 5.1870e-04 - lr: 1.0000e-04
Epoch 55/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.4186e-04 - val_loss: 7.9401e-05 - lr: 1.0000e-04
Epoch 56/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.7324e-04 - val_loss: 9.3971e-05 - lr: 1.0000e-04
Epoch 57/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3775e-04 - val_loss: 1.4554e-04 - lr: 1.0000e-04
Epoch 58/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2992e-04 - val_loss: 9.5643e-05 - lr: 1.0000e-04
Epoch 59/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2934e-04 - val_loss: 7.1712e-05 - lr: 1.0000e-04
Epoch 60/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3317e-04 - val_loss: 1.5182e-04 - lr: 1.0000e-04
Epoch 61/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2192e-04 - val_loss: 9.4130e-05 - lr: 1.0000e-04
Epoch 62/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2367e-04 - val_loss: 1.1338e-04 - lr: 1.0000e-04
Epoch 63/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.5802e-04 - val_loss: 1.1068e-04 - lr: 1.0000e-04
Epoch 64/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1099e-04 - val_loss: 1.8395e-04 - lr: 1.0000e-04
Epoch 65/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2799e-04 - val_loss: 6.3500e-05 - lr: 1.0000e-04
Epoch 66/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2981e-04 - val_loss: 1.4402e-04 - lr: 1.0000e-04
Epoch 67/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2969e-04 - val_loss: 6.7762e-05 - lr: 1.0000e-04
Epoch 68/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3877e-04 - val_loss: 9.6731e-05 - lr: 1.0000e-04
Epoch 69/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1277e-04 - val_loss: 1.3771e-04 - lr: 1.0000e-04
Epoch 70/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1472e-04 - val_loss: 1.2268e-04 - lr: 1.0000e-04
Epoch 71/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1190e-04 - val_loss: 7.2661e-05 - lr: 1.0000e-04
Epoch 72/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3607e-04 - val_loss: 1.3418e-04 - lr: 1.0000e-04
Epoch 73/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1247e-04 - val_loss: 2.3557e-04 - lr: 1.0000e-04
Epoch 74/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1410e-04 - val_loss: 9.0923e-05 - lr: 1.0000e-04
Epoch 75/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1888e-04 - val_loss: 2.2146e-04 - lr: 1.0000e-04
In [ ]:
fig, ax = plt.subplots(figsize=(10, 7))

ax.plot(history.history['loss'], label='Training Loss')
ax.plot(history.history['val_loss'], label='Validation Loss')

ax.set(
    title='Processo de otimização das funções de perda',
    ylabel='Loss',
    xlabel='Epoch'
)

plt.legend()
plt.tight_layout()
plt.show()
In [ ]:
predictions = mlp.predict(x_test)
inverse_predictions = scaler.inverse_transform(predictions)
63/63 [==============================] - 0s 996us/step
In [ ]:
fig, ax = plt.subplots(figsize=(15, 9), subplot_kw=dict(projection='3d'))

ax.plot_wireframe(x1, x2, y, linewidths=0.5, color='lightgrey')
ax.scatter(x_test[:,0], x_test[:,1], scaler.inverse_transform(y_test), s=14, color='C0', label='Training data')
ax.scatter(x_test[:,0], x_test[:,1], inverse_predictions, s=15, marker='^', color='C1', label='Test data')

ax.set(
    xlabel='$x_1$',
    ylabel='$x_2$',
    zlabel='$f(x_1, x_2)$'
)

plt.legend()
plt.tight_layout()
plt.show()
In [ ]:
mse = mean_squared_error(scaler.inverse_transform(y_test), inverse_predictions)
rmse = mean_squared_error(scaler.inverse_transform(y_test), inverse_predictions, squared=False)
mae = mean_absolute_error(scaler.inverse_transform(y_test), inverse_predictions)
In [ ]:
print(f"Mean Squared Error: {mse}")
print(f"Root Mean Squared Error: {rmse}")
print(f"Mean Absolute Error: {mae}")
Mean Squared Error: 16880097.554985106
Root Mean Squared Error: 4108.5395890736045
Mean Absolute Error: 2477.902080862183

b)¶

$x_1^2 + x_2^2 + 2x_1x_2cos(\pi x_1x_2) + x_1 + x_2 -1$ com $|x_1| ≤1 , |x_2| ≤ 1$

In [ ]:
# function's definition
def g(x1,x2):
  return x1**2 + x2**2 + 2 * x1 * x2 * np.cos(np.pi * x1 * x2) + x1 + x2 - 1
In [ ]:
# generating points
x1, x2 = np.meshgrid(np.linspace(-1, 1, 100), np.linspace(-1, 1, 100))
y = g(x1,x2)
In [ ]:
# plotting surface
fig, ax = plt.subplots(figsize=(10, 7), subplot_kw=dict(projection='3d'))
ax.plot_surface(x1, x2, y)

ax.set(
    xlabel='$x_1$',
    ylabel='$x_2$',
    zlabel='$f(x_1, x_2)$'
)

plt.tight_layout()
plt.show()
In [ ]:
# split train test
x_train, x_test, y_train, y_test = train_test_split(
    np.vstack([x1.flatten(), x2.flatten()]).T, 
    y.flatten(), 
    test_size=0.2, 
    random_state=505
)
In [ ]:
# surface os points to test and to train
fig, ax = plt.subplots(figsize=(10, 7), subplot_kw=dict(projection='3d'))

ax.plot_wireframe(x1, x2, y, linewidths=0.5, color='lightgrey')
ax.scatter(x_train[:,0], x_train[:,1], y_train, s=1, color='darkorange', label='Training data')
ax.scatter(x_test[:,0], x_test[:,1], y_test, s=5, color='darkgreen', label='Test data')

ax.set(
    xlabel='$x_1$',
    ylabel='$x_2$',
    zlabel='$f(x_1, x_2)$'
)

plt.legend()
plt.tight_layout()
plt.show()
In [ ]:
mlp = Sequential([
    Dense(64, activation='relu', input_shape=(2,)),
    Dense(32, activation='relu'),
    Dense(16, activation='relu'),
    Dense(8, activation='relu'),
    Dense(4, activation='relu'),
    Dense(1, activation='linear')
])

mlp.compile(
    loss='mean_squared_error',
    optimizer='adam'
)

mlp.summary()
Model: "sequential_3"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_18 (Dense)            (None, 64)                192       
                                                                 
 dense_19 (Dense)            (None, 32)                2080      
                                                                 
 dense_20 (Dense)            (None, 16)                528       
                                                                 
 dense_21 (Dense)            (None, 8)                 136       
                                                                 
 dense_22 (Dense)            (None, 4)                 36        
                                                                 
 dense_23 (Dense)            (None, 1)                 5         
                                                                 
=================================================================
Total params: 2,977
Trainable params: 2,977
Non-trainable params: 0
_________________________________________________________________
In [ ]:
# training the model
history = mlp.fit(
    x_train, y_train,
    batch_size=8,
    epochs=2000,
    validation_split=0.1,
    callbacks=[
        tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10),
        tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=5, min_lr=0.0001)
    ]
)
Epoch 1/2000
900/900 [==============================] - 2s 1ms/step - loss: 0.3755 - val_loss: 0.1831 - lr: 0.0010
Epoch 2/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.1221 - val_loss: 0.0754 - lr: 0.0010
Epoch 3/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0488 - val_loss: 0.0392 - lr: 0.0010
Epoch 4/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0238 - val_loss: 0.0227 - lr: 0.0010
Epoch 5/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0188 - val_loss: 0.0208 - lr: 0.0010
Epoch 6/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0147 - val_loss: 0.0175 - lr: 0.0010
Epoch 7/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0147 - val_loss: 0.0202 - lr: 0.0010
Epoch 8/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0169 - val_loss: 0.0165 - lr: 0.0010
Epoch 9/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0135 - val_loss: 0.0165 - lr: 0.0010
Epoch 10/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0148 - val_loss: 0.0140 - lr: 0.0010
Epoch 11/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0100 - val_loss: 0.0106 - lr: 0.0010
Epoch 12/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0096 - val_loss: 0.0085 - lr: 0.0010
Epoch 13/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0058 - val_loss: 0.0064 - lr: 0.0010
Epoch 14/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0059 - val_loss: 0.0047 - lr: 0.0010
Epoch 15/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0049 - val_loss: 0.0043 - lr: 0.0010
Epoch 16/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0033 - val_loss: 0.0029 - lr: 0.0010
Epoch 17/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0027 - val_loss: 0.0028 - lr: 0.0010
Epoch 18/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0024 - val_loss: 0.0037 - lr: 0.0010
Epoch 19/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0030 - val_loss: 0.0022 - lr: 0.0010
Epoch 20/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0035 - val_loss: 0.0016 - lr: 0.0010
Epoch 21/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0013 - val_loss: 0.0039 - lr: 0.0010
Epoch 22/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0017 - val_loss: 0.0012 - lr: 0.0010
Epoch 23/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0017 - val_loss: 0.0011 - lr: 0.0010
Epoch 24/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0017 - val_loss: 0.0026 - lr: 0.0010
Epoch 25/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0010 - val_loss: 0.0046 - lr: 0.0010
Epoch 26/2000
900/900 [==============================] - 2s 2ms/step - loss: 0.0016 - val_loss: 8.2744e-04 - lr: 0.0010
Epoch 27/2000
900/900 [==============================] - 2s 2ms/step - loss: 0.0014 - val_loss: 0.0011 - lr: 0.0010
Epoch 28/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0018 - val_loss: 7.3059e-04 - lr: 0.0010
Epoch 29/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0013 - val_loss: 0.0021 - lr: 0.0010
Epoch 30/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0013 - val_loss: 5.2872e-04 - lr: 0.0010
Epoch 31/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.0995e-04 - val_loss: 7.3022e-04 - lr: 0.0010
Epoch 32/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0019 - val_loss: 0.0014 - lr: 0.0010
Epoch 33/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.4458e-04 - val_loss: 9.2240e-04 - lr: 0.0010
Epoch 34/2000
900/900 [==============================] - 1s 1ms/step - loss: 0.0019 - val_loss: 7.6538e-04 - lr: 0.0010
Epoch 35/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.7375e-04 - val_loss: 5.7007e-04 - lr: 0.0010
Epoch 36/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.5156e-04 - val_loss: 3.2450e-04 - lr: 1.0000e-04
Epoch 37/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.4211e-04 - val_loss: 3.0920e-04 - lr: 1.0000e-04
Epoch 38/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3987e-04 - val_loss: 3.2047e-04 - lr: 1.0000e-04
Epoch 39/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.4100e-04 - val_loss: 2.8895e-04 - lr: 1.0000e-04
Epoch 40/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3653e-04 - val_loss: 3.0373e-04 - lr: 1.0000e-04
Epoch 41/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3535e-04 - val_loss: 2.6592e-04 - lr: 1.0000e-04
Epoch 42/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3654e-04 - val_loss: 2.6799e-04 - lr: 1.0000e-04
Epoch 43/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.3816e-04 - val_loss: 2.5570e-04 - lr: 1.0000e-04
Epoch 44/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2641e-04 - val_loss: 2.9339e-04 - lr: 1.0000e-04
Epoch 45/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2121e-04 - val_loss: 3.2909e-04 - lr: 1.0000e-04
Epoch 46/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.2684e-04 - val_loss: 2.3325e-04 - lr: 1.0000e-04
Epoch 47/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1775e-04 - val_loss: 2.6653e-04 - lr: 1.0000e-04
Epoch 48/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1174e-04 - val_loss: 2.3168e-04 - lr: 1.0000e-04
Epoch 49/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1471e-04 - val_loss: 2.5295e-04 - lr: 1.0000e-04
Epoch 50/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.1344e-04 - val_loss: 2.6298e-04 - lr: 1.0000e-04
Epoch 51/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0876e-04 - val_loss: 2.0776e-04 - lr: 1.0000e-04
Epoch 52/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0718e-04 - val_loss: 2.1121e-04 - lr: 1.0000e-04
Epoch 53/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0307e-04 - val_loss: 2.0588e-04 - lr: 1.0000e-04
Epoch 54/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0388e-04 - val_loss: 1.9177e-04 - lr: 1.0000e-04
Epoch 55/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0178e-04 - val_loss: 2.0274e-04 - lr: 1.0000e-04
Epoch 56/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0178e-04 - val_loss: 2.2856e-04 - lr: 1.0000e-04
Epoch 57/2000
900/900 [==============================] - 1s 1ms/step - loss: 1.0126e-04 - val_loss: 2.3026e-04 - lr: 1.0000e-04
Epoch 58/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.3505e-05 - val_loss: 2.0378e-04 - lr: 1.0000e-04
Epoch 59/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.6313e-05 - val_loss: 1.8932e-04 - lr: 1.0000e-04
Epoch 60/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.1047e-05 - val_loss: 1.8083e-04 - lr: 1.0000e-04
Epoch 61/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.8763e-05 - val_loss: 1.8094e-04 - lr: 1.0000e-04
Epoch 62/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.1101e-05 - val_loss: 1.8211e-04 - lr: 1.0000e-04
Epoch 63/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.0412e-05 - val_loss: 1.9709e-04 - lr: 1.0000e-04
Epoch 64/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.3721e-05 - val_loss: 2.6898e-04 - lr: 1.0000e-04
Epoch 65/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.3494e-05 - val_loss: 1.8946e-04 - lr: 1.0000e-04
Epoch 66/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.1262e-05 - val_loss: 1.7357e-04 - lr: 1.0000e-04
Epoch 67/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.6589e-05 - val_loss: 1.6051e-04 - lr: 1.0000e-04
Epoch 68/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.8745e-05 - val_loss: 1.6204e-04 - lr: 1.0000e-04
Epoch 69/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.2716e-05 - val_loss: 1.5193e-04 - lr: 1.0000e-04
Epoch 70/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.5516e-05 - val_loss: 1.4889e-04 - lr: 1.0000e-04
Epoch 71/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.7297e-05 - val_loss: 1.4578e-04 - lr: 1.0000e-04
Epoch 72/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.0910e-05 - val_loss: 1.4944e-04 - lr: 1.0000e-04
Epoch 73/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.8776e-05 - val_loss: 1.7773e-04 - lr: 1.0000e-04
Epoch 74/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.2522e-05 - val_loss: 1.4596e-04 - lr: 1.0000e-04
Epoch 75/2000
900/900 [==============================] - 1s 1ms/step - loss: 8.4255e-05 - val_loss: 1.3306e-04 - lr: 1.0000e-04
Epoch 76/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.8852e-05 - val_loss: 1.6480e-04 - lr: 1.0000e-04
Epoch 77/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.2539e-05 - val_loss: 1.3074e-04 - lr: 1.0000e-04
Epoch 78/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.7328e-05 - val_loss: 1.2497e-04 - lr: 1.0000e-04
Epoch 79/2000
900/900 [==============================] - 1s 1ms/step - loss: 9.1113e-05 - val_loss: 1.4063e-04 - lr: 1.0000e-04
Epoch 80/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.1518e-05 - val_loss: 1.1932e-04 - lr: 1.0000e-04
Epoch 81/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.8567e-05 - val_loss: 1.3577e-04 - lr: 1.0000e-04
Epoch 82/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.6544e-05 - val_loss: 1.2487e-04 - lr: 1.0000e-04
Epoch 83/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.4925e-05 - val_loss: 1.4861e-04 - lr: 1.0000e-04
Epoch 84/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.4571e-05 - val_loss: 1.4264e-04 - lr: 1.0000e-04
Epoch 85/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.9797e-05 - val_loss: 1.1468e-04 - lr: 1.0000e-04
Epoch 86/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.4416e-05 - val_loss: 1.3066e-04 - lr: 1.0000e-04
Epoch 87/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.3910e-05 - val_loss: 1.5394e-04 - lr: 1.0000e-04
Epoch 88/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.9360e-05 - val_loss: 1.1626e-04 - lr: 1.0000e-04
Epoch 89/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.7308e-05 - val_loss: 1.1188e-04 - lr: 1.0000e-04
Epoch 90/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.2721e-05 - val_loss: 1.1300e-04 - lr: 1.0000e-04
Epoch 91/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.7185e-05 - val_loss: 1.2724e-04 - lr: 1.0000e-04
Epoch 92/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.7134e-05 - val_loss: 1.1569e-04 - lr: 1.0000e-04
Epoch 93/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.8117e-05 - val_loss: 1.1088e-04 - lr: 1.0000e-04
Epoch 94/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.8846e-05 - val_loss: 1.3633e-04 - lr: 1.0000e-04
Epoch 95/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.7340e-05 - val_loss: 1.2110e-04 - lr: 1.0000e-04
Epoch 96/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.7974e-05 - val_loss: 1.7295e-04 - lr: 1.0000e-04
Epoch 97/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.7266e-05 - val_loss: 1.1850e-04 - lr: 1.0000e-04
Epoch 98/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.2741e-05 - val_loss: 1.2930e-04 - lr: 1.0000e-04
Epoch 99/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.4239e-05 - val_loss: 1.0425e-04 - lr: 1.0000e-04
Epoch 100/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.8227e-05 - val_loss: 1.0128e-04 - lr: 1.0000e-04
Epoch 101/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.1065e-05 - val_loss: 1.0580e-04 - lr: 1.0000e-04
Epoch 102/2000
900/900 [==============================] - 1s 1ms/step - loss: 7.2595e-05 - val_loss: 1.0268e-04 - lr: 1.0000e-04
Epoch 103/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.0114e-05 - val_loss: 1.3312e-04 - lr: 1.0000e-04
Epoch 104/2000
900/900 [==============================] - 1s 2ms/step - loss: 6.9939e-05 - val_loss: 1.1080e-04 - lr: 1.0000e-04
Epoch 105/2000
900/900 [==============================] - 2s 2ms/step - loss: 6.1671e-05 - val_loss: 9.3537e-05 - lr: 1.0000e-04
Epoch 106/2000
900/900 [==============================] - 2s 2ms/step - loss: 6.9090e-05 - val_loss: 1.0073e-04 - lr: 1.0000e-04
Epoch 107/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.0073e-05 - val_loss: 1.4321e-04 - lr: 1.0000e-04
Epoch 108/2000
900/900 [==============================] - 1s 1ms/step - loss: 5.8665e-05 - val_loss: 1.0323e-04 - lr: 1.0000e-04
Epoch 109/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.1377e-05 - val_loss: 2.0407e-04 - lr: 1.0000e-04
Epoch 110/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.5973e-05 - val_loss: 9.2555e-05 - lr: 1.0000e-04
Epoch 111/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.1832e-05 - val_loss: 1.1252e-04 - lr: 1.0000e-04
Epoch 112/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.1293e-05 - val_loss: 8.5153e-05 - lr: 1.0000e-04
Epoch 113/2000
900/900 [==============================] - 1s 1ms/step - loss: 5.6221e-05 - val_loss: 9.7618e-05 - lr: 1.0000e-04
Epoch 114/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.3836e-05 - val_loss: 9.4311e-05 - lr: 1.0000e-04
Epoch 115/2000
900/900 [==============================] - 1s 1ms/step - loss: 5.8413e-05 - val_loss: 1.3108e-04 - lr: 1.0000e-04
Epoch 116/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.1561e-05 - val_loss: 9.7013e-05 - lr: 1.0000e-04
Epoch 117/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.2772e-05 - val_loss: 1.0615e-04 - lr: 1.0000e-04
Epoch 118/2000
900/900 [==============================] - 1s 1ms/step - loss: 5.7437e-05 - val_loss: 1.3355e-04 - lr: 1.0000e-04
Epoch 119/2000
900/900 [==============================] - 1s 1ms/step - loss: 5.7079e-05 - val_loss: 9.6611e-05 - lr: 1.0000e-04
Epoch 120/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.2548e-05 - val_loss: 1.0475e-04 - lr: 1.0000e-04
Epoch 121/2000
900/900 [==============================] - 1s 1ms/step - loss: 5.7250e-05 - val_loss: 1.1300e-04 - lr: 1.0000e-04
Epoch 122/2000
900/900 [==============================] - 1s 1ms/step - loss: 6.4314e-05 - val_loss: 9.1217e-05 - lr: 1.0000e-04
In [ ]:
# plotting loss function
fig, ax = plt.subplots(figsize=(10, 7))

ax.plot(history.history['loss'], label='Training Loss')
ax.plot(history.history['val_loss'], label='Validation Loss')

ax.set(
    title='Processo de otimização das funções de perda',
    ylabel='Loss',
    xlabel='Epoch'
)

plt.legend()
plt.tight_layout()
plt.show()
In [ ]:
# true points and predicted points
fig, ax = plt.subplots(figsize=(15, 9), subplot_kw=dict(projection='3d'))

ax.plot_wireframe(x1, x2, y, linewidths=0.5, color='lightgrey')
ax.scatter(x_test[:,0], x_test[:,1], y_test, s=14, color='C0', label='Training data')
ax.scatter(x_test[:,0], x_test[:,1], mlp.predict(x_test), s=15, marker='^', color='C1', label='Test data')

ax.set(
    xlabel='$x_1$',
    ylabel='$x_2$',
    zlabel='$f(x_1, x_2)$'
)

plt.legend()
plt.tight_layout()
plt.show()
63/63 [==============================] - 0s 1ms/step
In [ ]:
mse = mean_squared_error(y_test, mlp.predict(x_test))
rmse = mean_squared_error(y_test, mlp.predict(x_test), squared=False)
mae = mean_absolute_error(y_test, mlp.predict(x_test))
63/63 [==============================] - 0s 903us/step
63/63 [==============================] - 0s 988us/step
63/63 [==============================] - 0s 763us/step
In [ ]:
print(f"Mean Squared Error: {mse}")
print(f"Root Mean Squared Error: {rmse}")
print(f"Mean Absolute Error: {mae}")
Mean Squared Error: 4.9327674554969855e-05
Root Mean Squared Error: 0.007023366326411421
Mean Absolute Error: 0.0053168037015980385

Questão 2¶

Considere o problema de classificação de padrões bidimensionais constituído neste caso de 5 padrões. A distribuição dos padrões tem como base um quadrado centrado na origem interceptando os eixos nos pontos +1 e -1 de cada eixo. Os pontos +1 e -1 de cada eixo são centros de quatro semicírculos que se interceptam no interior do quadrado originando as classes 1,2,3,4 e a outra classe corresponde as regiões de não interseção. Após gerar aleato- riamente os dados que venham formar estas distribuições de dados, selecione um conjunto de treinamento e um conjunto de validação com o rótulo de cada classe. Solucione este problema considerando uma rede perceptron de múltiplas camada. Apresente na solução a curva do erro médio de treinamento e a curva do erro médio de teste. Apresente também a matriz de confusão.

Semicircle equations¶

Gráfico das equações descritas abaixo:
In [ ]:
# Semicírculo azul
def c1(x, y):
    if (x + 1)**2 + y**2 <= 1:
      return 1
    else:
      return 0

#Sermicírculo verde
def c2(x, y):
    if (x - 1)**2 + y**2 <= 1:
      return 1
    else:
      return 0

#Semicírculo amarelo
def c3(x, y):
    if x**2 + (y + 1)**2 <= 1:
      return 1
    else:
      return 0

#Semicírculo vermelha=o
def c4(x, y):
    if x**2 + (y - 1)**2 <= 1:
      return 1
    else:
      return 0

Classificando os pontos¶

  • Classe 0: região de fora
  • Classe 1: região dentro dos semicírculos azul e vermelho
  • Classe 2: região dentro dos semicírculos verde e vermelho
  • Classe 3: região dentro dos semicírculos verde e amarelo
  • Classe 4: região dentro dos semicírculos azul e amarelo
In [ ]:
x, y = np.meshgrid(np.linspace(-1, 1, 100), np.linspace(-1, 1, 100))

points = np.vstack(list(zip(x.ravel(), y.ravel())))

lista = []
for x_i, y_i in points:
  if c1(x_i, y_i) + c4(x_i, y_i) == 2:
    lista.append(1)
  elif c2(x_i, y_i) + c4(x_i, y_i) == 2:
    lista.append(2)
  elif c2(x_i, y_i) + c3(x_i, y_i) == 2:
    lista.append(3)
  elif c1(x_i, y_i) + c3(x_i, y_i) == 2:
    lista.append(4)
  else:
    lista.append(0)

labels = np.array(lista)

Train test split¶

In [ ]:
x_train, x_test, y_train, y_test = train_test_split(points, labels, test_size=0.2, stratify=labels)
In [ ]:
print(f"shape x train: {x_train.shape}")
print(f"shape y train: {y_train.shape}")
print(f"shape x test: {x_test.shape}")
print(f"shape y test: {y_test.shape}")
shape x train: (8000, 2)
shape y train: (8000,)
shape x test: (2000, 2)
shape y test: (2000,)
In [ ]:
fig, ax = plt.subplots(ncols=3, figsize=(40, 12))

color1 = (181/255, 181/255, 181/255, 1.0)
color2 = (38/255, 118/255, 222/255, 1.0)
color3 = (38/255, 222/255, 118/255, 1.0)
color4 = (235/255, 91/255, 156/255, 1.0)
color5 = (240/255, 130/255, 44/255, 1.0)

colormap = np.array([color1, color2, color3, color4, color5])

dataset_scatter = ax[0].scatter(points[:,0], points[:,1], c=colormap[labels], marker='d')

ax[0].set(
    title='Dataset',
    xlabel='$x$',
    ylabel='$y$'
)

ax[1].scatter(x_train[:,0], x_train[:,1], c=colormap[y_train], marker='d')

ax[1].set(
    title='Training set',
    xlabel='$x$',
    ylabel='$y$'
)

ax[2].scatter(x_test[:,0], x_test[:,1], c=colormap[y_test], marker='d')

ax[2].set(
    title='Test set',
    xlabel='$x$',
    ylabel='$y$'
)

from matplotlib.patches import Patch
from matplotlib.lines import Line2D

legend_elements = [
    Line2D([0], [0], marker='o', color='w', label='Scatter', markerfacecolor=(181/255, 181/255, 181/255), markersize=15),
    Line2D([0], [0], marker='o', color='w', label='Scatter', markerfacecolor=(38/255, 118/255, 222/255), markersize=15),
    Line2D([0], [0], marker='o', color='w', label='Scatter', markerfacecolor=(38/255, 222/255, 118/255), markersize=15),
    Line2D([0], [0], marker='o', color='w', label='Scatter', markerfacecolor=(235/255, 91/255, 156/255), markersize=15),
    Line2D([0], [0], marker='o', color='w', label='Scatter', markerfacecolor=(240/255, 130/255, 44/255), markersize=15)
]

fig.legend(
    legend_elements,
    ['0', '1', '2', '3', '4'],
    loc='lower center',
    title='Classes'
)

plt.show()

Define the model¶

In [ ]:
mlp = Sequential([
    Dense(64, activation='relu', input_shape=(2,)),
    Dense(32, activation='relu'),
    Dense(16, activation='relu'),
    Dense(8, activation='relu'),
    Dense(5, activation='softmax')
])

mlp.compile(
    loss='sparse_categorical_crossentropy',
    optimizer='adam',
    metrics=['acc']
)


mlp.summary()
Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_5 (Dense)             (None, 64)                192       
                                                                 
 dense_6 (Dense)             (None, 32)                2080      
                                                                 
 dense_7 (Dense)             (None, 16)                528       
                                                                 
 dense_8 (Dense)             (None, 8)                 136       
                                                                 
 dense_9 (Dense)             (None, 5)                 45        
                                                                 
=================================================================
Total params: 2,981
Trainable params: 2,981
Non-trainable params: 0
_________________________________________________________________

Training the model¶

In [ ]:
history = mlp.fit(
    x_train, y_train.reshape((-1,1)),
    validation_split=0.1,
    batch_size=10,
    epochs=2000,
    callbacks=[
        tf.keras.callbacks.EarlyStopping(monitor='loss', patience=10),
        tf.keras.callbacks.ReduceLROnPlateau(monitor='loss', factor=0.1, patience=5, min_lr=0.0001)
    ]
)
Epoch 1/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.7271 - acc: 0.6861 - val_loss: 0.4313 - val_acc: 0.8350 - lr: 0.0010
Epoch 2/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.3175 - acc: 0.8826 - val_loss: 0.2762 - val_acc: 0.8975 - lr: 0.0010
Epoch 3/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.2327 - acc: 0.9156 - val_loss: 0.2305 - val_acc: 0.9125 - lr: 0.0010
Epoch 4/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.1948 - acc: 0.9292 - val_loss: 0.1951 - val_acc: 0.9162 - lr: 0.0010
Epoch 5/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1680 - acc: 0.9410 - val_loss: 0.1584 - val_acc: 0.9438 - lr: 0.0010
Epoch 6/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1588 - acc: 0.9375 - val_loss: 0.1548 - val_acc: 0.9375 - lr: 0.0010
Epoch 7/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1468 - acc: 0.9440 - val_loss: 0.1422 - val_acc: 0.9438 - lr: 0.0010
Epoch 8/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1337 - acc: 0.9468 - val_loss: 0.1154 - val_acc: 0.9550 - lr: 0.0010
Epoch 9/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1251 - acc: 0.9507 - val_loss: 0.1369 - val_acc: 0.9350 - lr: 0.0010
Epoch 10/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1272 - acc: 0.9499 - val_loss: 0.1201 - val_acc: 0.9575 - lr: 0.0010
Epoch 11/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1169 - acc: 0.9533 - val_loss: 0.1246 - val_acc: 0.9438 - lr: 0.0010
Epoch 12/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1170 - acc: 0.9517 - val_loss: 0.1711 - val_acc: 0.9212 - lr: 0.0010
Epoch 13/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1165 - acc: 0.9532 - val_loss: 0.1301 - val_acc: 0.9438 - lr: 0.0010
Epoch 14/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1164 - acc: 0.9524 - val_loss: 0.0993 - val_acc: 0.9588 - lr: 0.0010
Epoch 15/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1120 - acc: 0.9519 - val_loss: 0.1097 - val_acc: 0.9488 - lr: 0.0010
Epoch 16/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1062 - acc: 0.9569 - val_loss: 0.1492 - val_acc: 0.9337 - lr: 0.0010
Epoch 17/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1104 - acc: 0.9533 - val_loss: 0.0911 - val_acc: 0.9588 - lr: 0.0010
Epoch 18/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1109 - acc: 0.9550 - val_loss: 0.1208 - val_acc: 0.9525 - lr: 0.0010
Epoch 19/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1010 - acc: 0.9592 - val_loss: 0.1122 - val_acc: 0.9513 - lr: 0.0010
Epoch 20/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0980 - acc: 0.9599 - val_loss: 0.0863 - val_acc: 0.9638 - lr: 0.0010
Epoch 21/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0965 - acc: 0.9596 - val_loss: 0.0889 - val_acc: 0.9613 - lr: 0.0010
Epoch 22/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0994 - acc: 0.9597 - val_loss: 0.1672 - val_acc: 0.9262 - lr: 0.0010
Epoch 23/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0979 - acc: 0.9589 - val_loss: 0.1394 - val_acc: 0.9463 - lr: 0.0010
Epoch 24/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1035 - acc: 0.9558 - val_loss: 0.0798 - val_acc: 0.9638 - lr: 0.0010
Epoch 25/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0924 - acc: 0.9633 - val_loss: 0.0743 - val_acc: 0.9688 - lr: 0.0010
Epoch 26/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0937 - acc: 0.9601 - val_loss: 0.1888 - val_acc: 0.9425 - lr: 0.0010
Epoch 27/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.1012 - acc: 0.9583 - val_loss: 0.0916 - val_acc: 0.9600 - lr: 0.0010
Epoch 28/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0925 - acc: 0.9617 - val_loss: 0.0768 - val_acc: 0.9613 - lr: 0.0010
Epoch 29/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0933 - acc: 0.9632 - val_loss: 0.1094 - val_acc: 0.9513 - lr: 0.0010
Epoch 30/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0930 - acc: 0.9617 - val_loss: 0.0828 - val_acc: 0.9613 - lr: 0.0010
Epoch 31/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0574 - acc: 0.9829 - val_loss: 0.0603 - val_acc: 0.9737 - lr: 1.0000e-04
Epoch 32/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0554 - acc: 0.9847 - val_loss: 0.0665 - val_acc: 0.9712 - lr: 1.0000e-04
Epoch 33/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0549 - acc: 0.9856 - val_loss: 0.0587 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 34/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0556 - acc: 0.9844 - val_loss: 0.0582 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 35/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0544 - acc: 0.9847 - val_loss: 0.0599 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 36/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0542 - acc: 0.9847 - val_loss: 0.0614 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 37/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0544 - acc: 0.9851 - val_loss: 0.0634 - val_acc: 0.9737 - lr: 1.0000e-04
Epoch 38/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0536 - acc: 0.9846 - val_loss: 0.0570 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 39/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0530 - acc: 0.9854 - val_loss: 0.0511 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 40/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0536 - acc: 0.9858 - val_loss: 0.0554 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 41/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0527 - acc: 0.9862 - val_loss: 0.0612 - val_acc: 0.9737 - lr: 1.0000e-04
Epoch 42/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0527 - acc: 0.9856 - val_loss: 0.0540 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 43/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0519 - acc: 0.9861 - val_loss: 0.0597 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 44/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0513 - acc: 0.9869 - val_loss: 0.0547 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 45/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0514 - acc: 0.9861 - val_loss: 0.0577 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 46/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0511 - acc: 0.9850 - val_loss: 0.0548 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 47/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0518 - acc: 0.9847 - val_loss: 0.0580 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 48/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0509 - acc: 0.9861 - val_loss: 0.0565 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 49/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0502 - acc: 0.9868 - val_loss: 0.0508 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 50/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0505 - acc: 0.9854 - val_loss: 0.0611 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 51/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0492 - acc: 0.9862 - val_loss: 0.0535 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 52/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0493 - acc: 0.9858 - val_loss: 0.0552 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 53/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0495 - acc: 0.9868 - val_loss: 0.0559 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 54/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0487 - acc: 0.9856 - val_loss: 0.0526 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 55/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0490 - acc: 0.9867 - val_loss: 0.0565 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 56/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0484 - acc: 0.9864 - val_loss: 0.0520 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 57/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0481 - acc: 0.9882 - val_loss: 0.0494 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 58/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0476 - acc: 0.9882 - val_loss: 0.0550 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 59/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0478 - acc: 0.9872 - val_loss: 0.0577 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 60/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0480 - acc: 0.9864 - val_loss: 0.0521 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 61/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0476 - acc: 0.9872 - val_loss: 0.0478 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 62/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0477 - acc: 0.9865 - val_loss: 0.0485 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 63/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0471 - acc: 0.9857 - val_loss: 0.0510 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 64/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0464 - acc: 0.9890 - val_loss: 0.0604 - val_acc: 0.9737 - lr: 1.0000e-04
Epoch 65/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0470 - acc: 0.9878 - val_loss: 0.0538 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 66/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0463 - acc: 0.9878 - val_loss: 0.0523 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 67/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0462 - acc: 0.9868 - val_loss: 0.0496 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 68/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0458 - acc: 0.9871 - val_loss: 0.0516 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 69/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0458 - acc: 0.9875 - val_loss: 0.0488 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 70/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0460 - acc: 0.9871 - val_loss: 0.0504 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 71/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0462 - acc: 0.9868 - val_loss: 0.0586 - val_acc: 0.9725 - lr: 1.0000e-04
Epoch 72/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0450 - acc: 0.9879 - val_loss: 0.0515 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 73/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0450 - acc: 0.9882 - val_loss: 0.0560 - val_acc: 0.9725 - lr: 1.0000e-04
Epoch 74/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0450 - acc: 0.9890 - val_loss: 0.0512 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 75/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0452 - acc: 0.9881 - val_loss: 0.0480 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 76/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0441 - acc: 0.9889 - val_loss: 0.0443 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 77/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0446 - acc: 0.9868 - val_loss: 0.0481 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 78/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0442 - acc: 0.9876 - val_loss: 0.0562 - val_acc: 0.9737 - lr: 1.0000e-04
Epoch 79/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0439 - acc: 0.9885 - val_loss: 0.0434 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 80/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0443 - acc: 0.9867 - val_loss: 0.0488 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 81/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0437 - acc: 0.9893 - val_loss: 0.0470 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 82/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0436 - acc: 0.9885 - val_loss: 0.0456 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 83/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0432 - acc: 0.9885 - val_loss: 0.0515 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 84/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0443 - acc: 0.9871 - val_loss: 0.0486 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 85/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0431 - acc: 0.9886 - val_loss: 0.0446 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 86/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0430 - acc: 0.9886 - val_loss: 0.0506 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 87/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0430 - acc: 0.9886 - val_loss: 0.0503 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 88/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0430 - acc: 0.9890 - val_loss: 0.0529 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 89/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0423 - acc: 0.9887 - val_loss: 0.0550 - val_acc: 0.9712 - lr: 1.0000e-04
Epoch 90/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0426 - acc: 0.9889 - val_loss: 0.0499 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 91/2000
720/720 [==============================] - 3s 4ms/step - loss: 0.0415 - acc: 0.9906 - val_loss: 0.0428 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 92/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0422 - acc: 0.9882 - val_loss: 0.0472 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 93/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0422 - acc: 0.9883 - val_loss: 0.0434 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 94/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0425 - acc: 0.9883 - val_loss: 0.0464 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 95/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0415 - acc: 0.9903 - val_loss: 0.0567 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 96/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0416 - acc: 0.9903 - val_loss: 0.0423 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 97/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0418 - acc: 0.9889 - val_loss: 0.0464 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 98/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0410 - acc: 0.9906 - val_loss: 0.0502 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 99/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0414 - acc: 0.9896 - val_loss: 0.0430 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 100/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0408 - acc: 0.9894 - val_loss: 0.0437 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 101/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0413 - acc: 0.9885 - val_loss: 0.0478 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 102/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0408 - acc: 0.9893 - val_loss: 0.0418 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 103/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0403 - acc: 0.9896 - val_loss: 0.0454 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 104/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0406 - acc: 0.9881 - val_loss: 0.0420 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 105/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0410 - acc: 0.9883 - val_loss: 0.0502 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 106/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0410 - acc: 0.9889 - val_loss: 0.0394 - val_acc: 0.9900 - lr: 1.0000e-04
Epoch 107/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0401 - acc: 0.9887 - val_loss: 0.0515 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 108/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0400 - acc: 0.9900 - val_loss: 0.0409 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 109/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0405 - acc: 0.9892 - val_loss: 0.0411 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 110/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0407 - acc: 0.9879 - val_loss: 0.0589 - val_acc: 0.9688 - lr: 1.0000e-04
Epoch 111/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0392 - acc: 0.9896 - val_loss: 0.0581 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 112/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0394 - acc: 0.9896 - val_loss: 0.0492 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 113/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0395 - acc: 0.9890 - val_loss: 0.0449 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 114/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0396 - acc: 0.9878 - val_loss: 0.0389 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 115/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0393 - acc: 0.9901 - val_loss: 0.0464 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 116/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0395 - acc: 0.9893 - val_loss: 0.0502 - val_acc: 0.9725 - lr: 1.0000e-04
Epoch 117/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0390 - acc: 0.9885 - val_loss: 0.0448 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 118/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0385 - acc: 0.9899 - val_loss: 0.0396 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 119/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0391 - acc: 0.9900 - val_loss: 0.0409 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 120/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0386 - acc: 0.9914 - val_loss: 0.0514 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 121/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0392 - acc: 0.9889 - val_loss: 0.0431 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 122/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0389 - acc: 0.9894 - val_loss: 0.0447 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 123/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0382 - acc: 0.9894 - val_loss: 0.0507 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 124/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0384 - acc: 0.9899 - val_loss: 0.0406 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 125/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0378 - acc: 0.9901 - val_loss: 0.0470 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 126/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0384 - acc: 0.9889 - val_loss: 0.0460 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 127/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0383 - acc: 0.9889 - val_loss: 0.0390 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 128/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0381 - acc: 0.9875 - val_loss: 0.0396 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 129/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0381 - acc: 0.9894 - val_loss: 0.0554 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 130/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0378 - acc: 0.9889 - val_loss: 0.0478 - val_acc: 0.9725 - lr: 1.0000e-04
Epoch 131/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0373 - acc: 0.9903 - val_loss: 0.0416 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 132/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0383 - acc: 0.9893 - val_loss: 0.0394 - val_acc: 0.9900 - lr: 1.0000e-04
Epoch 133/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0372 - acc: 0.9899 - val_loss: 0.0383 - val_acc: 0.9900 - lr: 1.0000e-04
Epoch 134/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0373 - acc: 0.9901 - val_loss: 0.0469 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 135/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0371 - acc: 0.9897 - val_loss: 0.0401 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 136/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0377 - acc: 0.9890 - val_loss: 0.0436 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 137/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0368 - acc: 0.9894 - val_loss: 0.0405 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 138/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0363 - acc: 0.9897 - val_loss: 0.0450 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 139/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0371 - acc: 0.9900 - val_loss: 0.0392 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 140/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0370 - acc: 0.9899 - val_loss: 0.0474 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 141/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0363 - acc: 0.9892 - val_loss: 0.0386 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 142/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0371 - acc: 0.9889 - val_loss: 0.0436 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 143/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0365 - acc: 0.9908 - val_loss: 0.0487 - val_acc: 0.9737 - lr: 1.0000e-04
Epoch 144/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0366 - acc: 0.9892 - val_loss: 0.0380 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 145/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0363 - acc: 0.9900 - val_loss: 0.0393 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 146/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0371 - acc: 0.9899 - val_loss: 0.0374 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 147/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0362 - acc: 0.9894 - val_loss: 0.0489 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 148/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0359 - acc: 0.9894 - val_loss: 0.0459 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 149/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0360 - acc: 0.9899 - val_loss: 0.0328 - val_acc: 0.9937 - lr: 1.0000e-04
Epoch 150/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0357 - acc: 0.9892 - val_loss: 0.0459 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 151/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0351 - acc: 0.9896 - val_loss: 0.0425 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 152/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0365 - acc: 0.9885 - val_loss: 0.0402 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 153/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0350 - acc: 0.9906 - val_loss: 0.0390 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 154/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0362 - acc: 0.9894 - val_loss: 0.0378 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 155/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0365 - acc: 0.9878 - val_loss: 0.0366 - val_acc: 0.9900 - lr: 1.0000e-04
Epoch 156/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0352 - acc: 0.9907 - val_loss: 0.0386 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 157/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0351 - acc: 0.9897 - val_loss: 0.0435 - val_acc: 0.9762 - lr: 1.0000e-04
Epoch 158/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0350 - acc: 0.9910 - val_loss: 0.0530 - val_acc: 0.9700 - lr: 1.0000e-04
Epoch 159/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0345 - acc: 0.9904 - val_loss: 0.0379 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 160/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0359 - acc: 0.9886 - val_loss: 0.0375 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 161/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0347 - acc: 0.9906 - val_loss: 0.0394 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 162/2000
720/720 [==============================] - 3s 4ms/step - loss: 0.0345 - acc: 0.9900 - val_loss: 0.0400 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 163/2000
720/720 [==============================] - 3s 4ms/step - loss: 0.0346 - acc: 0.9907 - val_loss: 0.0372 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 164/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0348 - acc: 0.9892 - val_loss: 0.0340 - val_acc: 0.9925 - lr: 1.0000e-04
Epoch 165/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0346 - acc: 0.9897 - val_loss: 0.0386 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 166/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0356 - acc: 0.9887 - val_loss: 0.0398 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 167/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0347 - acc: 0.9904 - val_loss: 0.0436 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 168/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0345 - acc: 0.9897 - val_loss: 0.0350 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 169/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0333 - acc: 0.9899 - val_loss: 0.0433 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 170/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0345 - acc: 0.9890 - val_loss: 0.0370 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 171/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0343 - acc: 0.9897 - val_loss: 0.0417 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 172/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0342 - acc: 0.9896 - val_loss: 0.0353 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 173/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0348 - acc: 0.9904 - val_loss: 0.0407 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 174/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0334 - acc: 0.9912 - val_loss: 0.0405 - val_acc: 0.9812 - lr: 1.0000e-04
Epoch 175/2000
720/720 [==============================] - 3s 4ms/step - loss: 0.0338 - acc: 0.9912 - val_loss: 0.0392 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 176/2000
720/720 [==============================] - 4s 6ms/step - loss: 0.0341 - acc: 0.9892 - val_loss: 0.0326 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 177/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0339 - acc: 0.9907 - val_loss: 0.0433 - val_acc: 0.9775 - lr: 1.0000e-04
Epoch 178/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0328 - acc: 0.9896 - val_loss: 0.0374 - val_acc: 0.9850 - lr: 1.0000e-04
Epoch 179/2000
720/720 [==============================] - 5s 8ms/step - loss: 0.0339 - acc: 0.9889 - val_loss: 0.0330 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 180/2000
720/720 [==============================] - 3s 4ms/step - loss: 0.0330 - acc: 0.9907 - val_loss: 0.0345 - val_acc: 0.9912 - lr: 1.0000e-04
Epoch 181/2000
720/720 [==============================] - 4s 6ms/step - loss: 0.0335 - acc: 0.9903 - val_loss: 0.0344 - val_acc: 0.9887 - lr: 1.0000e-04
Epoch 182/2000
720/720 [==============================] - 4s 5ms/step - loss: 0.0325 - acc: 0.9915 - val_loss: 0.0348 - val_acc: 0.9875 - lr: 1.0000e-04
Epoch 183/2000
720/720 [==============================] - 2s 3ms/step - loss: 0.0334 - acc: 0.9904 - val_loss: 0.0411 - val_acc: 0.9787 - lr: 1.0000e-04
Epoch 184/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0336 - acc: 0.9906 - val_loss: 0.0375 - val_acc: 0.9837 - lr: 1.0000e-04
Epoch 185/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0334 - acc: 0.9903 - val_loss: 0.0397 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 186/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0329 - acc: 0.9907 - val_loss: 0.0452 - val_acc: 0.9750 - lr: 1.0000e-04
Epoch 187/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0334 - acc: 0.9908 - val_loss: 0.0334 - val_acc: 0.9862 - lr: 1.0000e-04
Epoch 188/2000
720/720 [==============================] - 3s 5ms/step - loss: 0.0332 - acc: 0.9906 - val_loss: 0.0378 - val_acc: 0.9825 - lr: 1.0000e-04
Epoch 189/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0329 - acc: 0.9904 - val_loss: 0.0425 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 190/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0328 - acc: 0.9901 - val_loss: 0.0316 - val_acc: 0.9912 - lr: 1.0000e-04
Epoch 191/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0330 - acc: 0.9911 - val_loss: 0.0463 - val_acc: 0.9800 - lr: 1.0000e-04
Epoch 192/2000
720/720 [==============================] - 2s 2ms/step - loss: 0.0326 - acc: 0.9908 - val_loss: 0.0391 - val_acc: 0.9825 - lr: 1.0000e-04

Evaluation¶

In [ ]:
fig, ax = plt.subplots(ncols=2, figsize=(16, 6))

ax[0].plot(history.history['loss'], label='Training loss')
ax[0].plot(history.history['val_loss'], label='Validation loss')

ax[0].legend()
ax[0].set(
    ylabel='Loss',
    xlabel='Epoch'
)

ax[1].plot(history.history['acc'], label='Training accuracy')
ax[1].plot(history.history['val_acc'], label='Validation accuracy')

ax[1].set(
    ylabel='Accuracy',
    xlabel='Epoch'
)

plt.legend()
plt.tight_layout()
plt.show()
In [ ]:
predictions = [np.argmax(p) for p in mlp.predict(x_test)]
63/63 [==============================] - 0s 1ms/step
In [ ]:
print(classification_report(y_test, predictions))
              precision    recall  f1-score   support

           0       0.99      0.98      0.98       884
           1       0.98      0.99      0.98       279
           2       1.00      0.98      0.99       279
           3       0.98      1.00      0.99       279
           4       0.99      1.00      0.99       279

    accuracy                           0.99      2000
   macro avg       0.99      0.99      0.99      2000
weighted avg       0.99      0.99      0.99      2000

In [ ]:
fig, ax = plt.subplots(figsize=(14, 8))

ConfusionMatrixDisplay(confusion_matrix(predictions, y_test)).plot(values_format='.0f', ax=ax)

ax.set(
    title='Confusion Matrix',
    xlabel='True Labels',
    ylabel='Predicted Labels'
)

plt.tight_layout()
plt.show()
In [ ]:
fig, ax = plt.subplots(figsize=(12, 12))

ax.scatter(x_test[:,0], x_test[:,1], c=y_test, marker='d')

ax.scatter(x_test[:,0], x_test[:,1], c=predictions, marker='x')

ax.set(
    title='Predictions',
    xlabel='$x$',
    ylabel='$y$'
)


plt.show()

Questão 3¶

Considere uma rede deep learning convolutiva (treinada) aplicada à classificação de padrões em imagens. A base de dados considerada é a CIFAR-10 (pesquise). A referida base de dados consiste de 60 mil imagens coloridas de 32x32 pixels, com 50 mil para treino e 10 mil para teste. As imagens estão divididas em 10 classes, a saber: avião, navio, cami- nhão, automóvel, sapo, pássaro, cachorro, gato, cavalo e cervo. Cada imagem possui apenas um dos objetos da classe de interesse, podendo estar parcialmente obstruído por outros ob- jetos que não pertençam a esse conjunto. Apresente os resultados da classificação em uma matriz de confusão.

Este trabalho foi baseado no trabalho do CODERONIN1 do Kaggle Link: https://www.kaggle.com/code/adi160/cifar-10-keras-transfer-learning

In [24]:
from sklearn.utils.multiclass import unique_labels
import os
import matplotlib.image as mpimg
import seaborn as sns
from keras.layers import Flatten,Dense,BatchNormalization,Activation,Dropout
from keras.utils import to_categorical
from tensorflow.keras.callbacks import EarlyStopping

# library for transfer learning
from keras.applications import VGG19,ResNet50

# data augumentation
from keras.preprocessing.image import ImageDataGenerator

# import dataset
from keras.datasets import cifar10

Load dataset¶

In [16]:
# load dataset and divide in train and teste datasets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
In [17]:
print(f"shape x train: {x_train.shape}")
print(f"shape y train: {y_train.shape}")
print(f"shape x test: {x_test.shape}")
print(f"shape y test: {y_test.shape}")
shape x train: (50000, 32, 32, 3)
shape y train: (50000, 1)
shape x test: (10000, 32, 32, 3)
shape y test: (10000, 1)
In [19]:
y_train=to_categorical(y_train)
y_test=to_categorical(y_test)
In [20]:
print(f"shape x train: {x_train.shape}")
print(f"shape y train: {y_train.shape}")
print(f"shape x test: {x_test.shape}")
print(f"shape y test: {y_test.shape}")
shape x train: (50000, 32, 32, 3)
shape y train: (50000, 10)
shape x test: (10000, 32, 32, 3)
shape y test: (10000, 10)

Data Augmentation¶

In [30]:
# Let's instantiate the object
train_generator = ImageDataGenerator(
                                    rotation_range=2, 
                                    horizontal_flip=True,
                                    zoom_range=.1 )

test_generator = ImageDataGenerator(
                                    rotation_range=2, 
                                    horizontal_flip= True,
                                    zoom_range=.1)
In [31]:
#Fit the augmentation method to the data
train_generator.fit(x_train)
test_generator.fit(x_test)

Transfer Learning - VGG19¶

In [39]:
lrr= ReduceLROnPlateau(
                       monitor='val_accuracy', #Metric to be measured
                       factor=.01, #Factor by which learning rate will be reduced
                       patience=3,  #No. of epochs after which if there is no improvement in the val_acc, the learning rate is reduced
                       min_lr=1e-5) #The minimum learning rate

Importing¶

In [40]:
# The first base model used is VGG19. 
# The pretrained weights from the imagenet challenge are used
base_vgg19 = VGG19(include_top=False,
                   input_shape=(32,32,3),
                   classes=y_train.shape[1],
                   weights='imagenet')

base_vgg19.summary()
Model: "vgg19"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_4 (InputLayer)        [(None, 32, 32, 3)]       0         
                                                                 
 block1_conv1 (Conv2D)       (None, 32, 32, 64)        1792      
                                                                 
 block1_conv2 (Conv2D)       (None, 32, 32, 64)        36928     
                                                                 
 block1_pool (MaxPooling2D)  (None, 16, 16, 64)        0         
                                                                 
 block2_conv1 (Conv2D)       (None, 16, 16, 128)       73856     
                                                                 
 block2_conv2 (Conv2D)       (None, 16, 16, 128)       147584    
                                                                 
 block2_pool (MaxPooling2D)  (None, 8, 8, 128)         0         
                                                                 
 block3_conv1 (Conv2D)       (None, 8, 8, 256)         295168    
                                                                 
 block3_conv2 (Conv2D)       (None, 8, 8, 256)         590080    
                                                                 
 block3_conv3 (Conv2D)       (None, 8, 8, 256)         590080    
                                                                 
 block3_conv4 (Conv2D)       (None, 8, 8, 256)         590080    
                                                                 
 block3_pool (MaxPooling2D)  (None, 4, 4, 256)         0         
                                                                 
 block4_conv1 (Conv2D)       (None, 4, 4, 512)         1180160   
                                                                 
 block4_conv2 (Conv2D)       (None, 4, 4, 512)         2359808   
                                                                 
 block4_conv3 (Conv2D)       (None, 4, 4, 512)         2359808   
                                                                 
 block4_conv4 (Conv2D)       (None, 4, 4, 512)         2359808   
                                                                 
 block4_pool (MaxPooling2D)  (None, 2, 2, 512)         0         
                                                                 
 block5_conv1 (Conv2D)       (None, 2, 2, 512)         2359808   
                                                                 
 block5_conv2 (Conv2D)       (None, 2, 2, 512)         2359808   
                                                                 
 block5_conv3 (Conv2D)       (None, 2, 2, 512)         2359808   
                                                                 
 block5_conv4 (Conv2D)       (None, 2, 2, 512)         2359808   
                                                                 
 block5_pool (MaxPooling2D)  (None, 1, 1, 512)         0         
                                                                 
=================================================================
Total params: 20,024,384
Trainable params: 20,024,384
Non-trainable params: 0
_________________________________________________________________
In [41]:
vgg19 = Sequential()
vgg19.add(base_vgg19) 
vgg19.add(Flatten()) 

#Add the Dense layers along with activation and batch normalization
vgg19.add(Dense(1024,activation=('relu'),input_dim=512))
vgg19.add(Dense(512,activation=('relu'))) 
vgg19.add(Dense(256,activation=('relu'))) 
#model_1.add(Dropout(.3)) 
vgg19.add(Dense(128,activation=('relu')))
#model_1.add(Dropout(.2))
vgg19.add(Dense(10,activation=('softmax'))) 

vgg19.summary()
Model: "sequential_3"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 vgg19 (Functional)          (None, 1, 1, 512)         20024384  
                                                                 
 flatten_3 (Flatten)         (None, 512)               0         
                                                                 
 dense_15 (Dense)            (None, 1024)              525312    
                                                                 
 dense_16 (Dense)            (None, 512)               524800    
                                                                 
 dense_17 (Dense)            (None, 256)               131328    
                                                                 
 dense_18 (Dense)            (None, 128)               32896     
                                                                 
 dense_19 (Dense)            (None, 10)                1290      
                                                                 
=================================================================
Total params: 21,240,010
Trainable params: 21,240,010
Non-trainable params: 0
_________________________________________________________________

Training¶

In [42]:
# hyperparameters
batch_size= 100
epochs=50

learn_rate=.001

sgd=SGD(lr=learn_rate,momentum=.9,nesterov=False)
adam=Adam(lr=learn_rate, beta_1=0.9, beta_2=0.999, 
          epsilon=None, decay=0.0, amsgrad=False)

vgg19.compile(optimizer=sgd,
                    loss='categorical_crossentropy',
                    metrics=['accuracy'])
In [43]:
vgg19.fit_generator(train_generator.flow(x_train,y_train,batch_size=batch_size),
                      epochs=epochs,
                      steps_per_epoch=x_train.shape[0]//batch_size,
                      validation_data=test_generator.flow(x_test,y_test,batch_size=batch_size),validation_steps=250,
                      callbacks=[lrr],verbose=1)
Epoch 1/50
/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:5: UserWarning: `Model.fit_generator` is deprecated and will be removed in a future version. Please use `Model.fit`, which supports generators.
  """
500/500 [==============================] - ETA: 0s - loss: 1.6137 - accuracy: 0.4008
WARNING:tensorflow:Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least `steps_per_epoch * epochs` batches (in this case, 250 batches). You may need to use the repeat() function when building your dataset.
500/500 [==============================] - 37s 73ms/step - loss: 1.6137 - accuracy: 0.4008 - val_loss: 1.0790 - val_accuracy: 0.6240 - lr: 0.0010
Epoch 2/50
500/500 [==============================] - ETA: 0s - loss: 0.8359 - accuracy: 0.7123
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.8359 - accuracy: 0.7123 - lr: 0.0010
Epoch 3/50
500/500 [==============================] - ETA: 0s - loss: 0.6465 - accuracy: 0.7799
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 33s 65ms/step - loss: 0.6465 - accuracy: 0.7799 - lr: 0.0010
Epoch 4/50
500/500 [==============================] - ETA: 0s - loss: 0.5492 - accuracy: 0.8131
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.5492 - accuracy: 0.8131 - lr: 0.0010
Epoch 5/50
500/500 [==============================] - ETA: 0s - loss: 0.4832 - accuracy: 0.8365
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.4832 - accuracy: 0.8365 - lr: 0.0010
Epoch 6/50
500/500 [==============================] - ETA: 0s - loss: 0.4242 - accuracy: 0.8548
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.4242 - accuracy: 0.8548 - lr: 0.0010
Epoch 7/50
500/500 [==============================] - ETA: 0s - loss: 0.3861 - accuracy: 0.8667
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.3861 - accuracy: 0.8667 - lr: 0.0010
Epoch 8/50
500/500 [==============================] - ETA: 0s - loss: 0.3509 - accuracy: 0.8775
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 35s 70ms/step - loss: 0.3509 - accuracy: 0.8775 - lr: 0.0010
Epoch 9/50
500/500 [==============================] - ETA: 0s - loss: 0.3144 - accuracy: 0.8912
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.3144 - accuracy: 0.8912 - lr: 0.0010
Epoch 10/50
500/500 [==============================] - ETA: 0s - loss: 0.2832 - accuracy: 0.9022
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.2832 - accuracy: 0.9022 - lr: 0.0010
Epoch 11/50
500/500 [==============================] - ETA: 0s - loss: 0.2564 - accuracy: 0.9116
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.2564 - accuracy: 0.9116 - lr: 0.0010
Epoch 12/50
500/500 [==============================] - ETA: 0s - loss: 0.2257 - accuracy: 0.9214
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.2257 - accuracy: 0.9214 - lr: 0.0010
Epoch 13/50
500/500 [==============================] - ETA: 0s - loss: 0.2134 - accuracy: 0.9262
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.2134 - accuracy: 0.9262 - lr: 0.0010
Epoch 14/50
500/500 [==============================] - ETA: 0s - loss: 0.1928 - accuracy: 0.9343
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.1928 - accuracy: 0.9343 - lr: 0.0010
Epoch 15/50
500/500 [==============================] - ETA: 0s - loss: 0.1731 - accuracy: 0.9400
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.1731 - accuracy: 0.9400 - lr: 0.0010
Epoch 16/50
500/500 [==============================] - ETA: 0s - loss: 0.1550 - accuracy: 0.9474
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.1550 - accuracy: 0.9474 - lr: 0.0010
Epoch 17/50
500/500 [==============================] - ETA: 0s - loss: 0.1412 - accuracy: 0.9514
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.1412 - accuracy: 0.9514 - lr: 0.0010
Epoch 18/50
500/500 [==============================] - ETA: 0s - loss: 0.1307 - accuracy: 0.9560
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.1307 - accuracy: 0.9560 - lr: 0.0010
Epoch 19/50
500/500 [==============================] - ETA: 0s - loss: 0.1171 - accuracy: 0.9605
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.1171 - accuracy: 0.9605 - lr: 0.0010
Epoch 20/50
500/500 [==============================] - ETA: 0s - loss: 0.1071 - accuracy: 0.9633
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.1071 - accuracy: 0.9633 - lr: 0.0010
Epoch 21/50
500/500 [==============================] - ETA: 0s - loss: 0.1004 - accuracy: 0.9654
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.1004 - accuracy: 0.9654 - lr: 0.0010
Epoch 22/50
500/500 [==============================] - ETA: 0s - loss: 0.0919 - accuracy: 0.9687
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0919 - accuracy: 0.9687 - lr: 0.0010
Epoch 23/50
500/500 [==============================] - ETA: 0s - loss: 0.0862 - accuracy: 0.9704
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 34s 68ms/step - loss: 0.0862 - accuracy: 0.9704 - lr: 0.0010
Epoch 24/50
500/500 [==============================] - ETA: 0s - loss: 0.0761 - accuracy: 0.9743
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0761 - accuracy: 0.9743 - lr: 0.0010
Epoch 25/50
500/500 [==============================] - ETA: 0s - loss: 0.0703 - accuracy: 0.9761
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0703 - accuracy: 0.9761 - lr: 0.0010
Epoch 26/50
500/500 [==============================] - ETA: 0s - loss: 0.0664 - accuracy: 0.9768
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0664 - accuracy: 0.9768 - lr: 0.0010
Epoch 27/50
500/500 [==============================] - ETA: 0s - loss: 0.0601 - accuracy: 0.9798
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0601 - accuracy: 0.9798 - lr: 0.0010
Epoch 28/50
500/500 [==============================] - ETA: 0s - loss: 0.0578 - accuracy: 0.9802
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0578 - accuracy: 0.9802 - lr: 0.0010
Epoch 29/50
500/500 [==============================] - ETA: 0s - loss: 0.0568 - accuracy: 0.9805
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0568 - accuracy: 0.9805 - lr: 0.0010
Epoch 30/50
500/500 [==============================] - ETA: 0s - loss: 0.0488 - accuracy: 0.9842
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0488 - accuracy: 0.9842 - lr: 0.0010
Epoch 31/50
500/500 [==============================] - ETA: 0s - loss: 0.0475 - accuracy: 0.9842
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0475 - accuracy: 0.9842 - lr: 0.0010
Epoch 32/50
500/500 [==============================] - ETA: 0s - loss: 0.0455 - accuracy: 0.9844
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0455 - accuracy: 0.9844 - lr: 0.0010
Epoch 33/50
500/500 [==============================] - ETA: 0s - loss: 0.0433 - accuracy: 0.9855
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0433 - accuracy: 0.9855 - lr: 0.0010
Epoch 34/50
500/500 [==============================] - ETA: 0s - loss: 0.0404 - accuracy: 0.9866
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0404 - accuracy: 0.9866 - lr: 0.0010
Epoch 35/50
500/500 [==============================] - ETA: 0s - loss: 0.0433 - accuracy: 0.9849
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0433 - accuracy: 0.9849 - lr: 0.0010
Epoch 36/50
500/500 [==============================] - ETA: 0s - loss: 0.0367 - accuracy: 0.9876
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0367 - accuracy: 0.9876 - lr: 0.0010
Epoch 37/50
500/500 [==============================] - ETA: 0s - loss: 0.0306 - accuracy: 0.9895
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0306 - accuracy: 0.9895 - lr: 0.0010
Epoch 38/50
500/500 [==============================] - ETA: 0s - loss: 0.0300 - accuracy: 0.9896
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0300 - accuracy: 0.9896 - lr: 0.0010
Epoch 39/50
500/500 [==============================] - ETA: 0s - loss: 0.0328 - accuracy: 0.9892
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0328 - accuracy: 0.9892 - lr: 0.0010
Epoch 40/50
500/500 [==============================] - ETA: 0s - loss: 0.0263 - accuracy: 0.9912
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0263 - accuracy: 0.9912 - lr: 0.0010
Epoch 41/50
500/500 [==============================] - ETA: 0s - loss: 0.0276 - accuracy: 0.9909
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0276 - accuracy: 0.9909 - lr: 0.0010
Epoch 42/50
500/500 [==============================] - ETA: 0s - loss: 0.0274 - accuracy: 0.9908
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 33s 65ms/step - loss: 0.0274 - accuracy: 0.9908 - lr: 0.0010
Epoch 43/50
500/500 [==============================] - ETA: 0s - loss: 0.0267 - accuracy: 0.9907
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0267 - accuracy: 0.9907 - lr: 0.0010
Epoch 44/50
500/500 [==============================] - ETA: 0s - loss: 0.0308 - accuracy: 0.9895
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0308 - accuracy: 0.9895 - lr: 0.0010
Epoch 45/50
500/500 [==============================] - ETA: 0s - loss: 0.0257 - accuracy: 0.9914
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0257 - accuracy: 0.9914 - lr: 0.0010
Epoch 46/50
500/500 [==============================] - ETA: 0s - loss: 0.0220 - accuracy: 0.9930
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0220 - accuracy: 0.9930 - lr: 0.0010
Epoch 47/50
500/500 [==============================] - ETA: 0s - loss: 0.0216 - accuracy: 0.9928
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 65ms/step - loss: 0.0216 - accuracy: 0.9928 - lr: 0.0010
Epoch 48/50
500/500 [==============================] - ETA: 0s - loss: 0.0211 - accuracy: 0.9928
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0211 - accuracy: 0.9928 - lr: 0.0010
Epoch 49/50
500/500 [==============================] - ETA: 0s - loss: 0.0216 - accuracy: 0.9926
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 33s 65ms/step - loss: 0.0216 - accuracy: 0.9926 - lr: 0.0010
Epoch 50/50
500/500 [==============================] - ETA: 0s - loss: 0.0230 - accuracy: 0.9920
WARNING:tensorflow:Learning rate reduction is conditioned on metric `val_accuracy` which is not available. Available metrics are: loss,accuracy,lr
500/500 [==============================] - 32s 64ms/step - loss: 0.0230 - accuracy: 0.9920 - lr: 0.0010
Out[43]:
<keras.callbacks.History at 0x7f660232e310>
In [44]:
fig, ax = plt.subplots(ncols=2, figsize=(16, 6))

ax[0].plot(vgg19.history.history['loss'], label='Training loss')
ax[0].plot(vgg19.history.history['val_loss'], label='Validation loss')

ax[0].legend()
ax[0].set(
    ylabel='Loss',
    xlabel='Epoch'
)

ax[1].plot(vgg19.history.history['accuracy'], label='Training accuracy')
ax[1].plot(vgg19.history.history['val_accuracy'], label='Validation accuracy')

ax[1].set(
    ylabel='Accuracy',
    xlabel='Epoch'
)

plt.legend()
plt.tight_layout()
plt.show()

Evaluation¶

In [45]:
def plot_confusion_matrix(y_true, y_pred, classes,
                          normalize=False,
                          title=None,
                          cmap=plt.cm.Blues):
    """
    This function prints and plots the confusion matrix.
    Normalization can be applied by setting `normalize=True`.
    """
    if not title:
        if normalize:
            title = 'Normalized confusion matrix'
        else:
            title = 'Confusion matrix, without normalization'

    # Compute confusion matrix
    cm = confusion_matrix(y_true, y_pred)
    if normalize:
        cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]
        print("Normalized confusion matrix")
    else:
        print('Confusion matrix, without normalization')

#     print(cm)

    fig, ax = plt.subplots(figsize=(7,7))
    im = ax.imshow(cm, interpolation='nearest', cmap=cmap)
    ax.figure.colorbar(im, ax=ax)
    # We want to show all ticks...
    ax.set(xticks=np.arange(cm.shape[1]),
           yticks=np.arange(cm.shape[0]),
           # ... and label them with the respective list entries
           xticklabels=classes, yticklabels=classes,
           title=title,
           ylabel='True label',
           xlabel='Predicted label')

    # Rotate the tick labels and set their alignment.
    plt.setp(ax.get_xticklabels(), rotation=45, ha="right",
             rotation_mode="anchor")
    # Loop over data dimensions and create text annotations.
    fmt = '.2f' if normalize else 'd'
    thresh = cm.max() / 2.
    for i in range(cm.shape[0]):
        for j in range(cm.shape[1]):
            ax.text(j, i, format(cm[i, j], fmt),
                    ha="center", va="center",
                    color="white" if cm[i, j] > thresh else "black")
    fig.tight_layout()
    return ax


np.set_printoptions(precision=2)
In [46]:
y_pred = vgg19.predict(x_test)
y_pred = np.argmax(y_pred,axis=1)
y_true=np.argmax(y_test,axis=1)

#Compute the confusion matrix
confusion_mtx = confusion_matrix(y_true,y_pred)
313/313 [==============================] - 4s 10ms/step
In [47]:
class_names=['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
In [48]:
# Plot non-normalized confusion matrix
plot_confusion_matrix(y_true, y_pred, classes=class_names,
                      title='Confusion matrix, without normalization')
Confusion matrix, without normalization
Out[48]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f64c4167a10>
In [49]:
# Plot normalized confusion matrix
plot_confusion_matrix(y_true, y_pred, classes=class_names, normalize=True,
                      title='Normalized confusion matrix')
# plt.show()
Normalized confusion matrix
Out[49]:
<matplotlib.axes._subplots.AxesSubplot at 0x7f64ba2477d0>
In [50]:
print(classification_report(y_true, y_pred, target_names=class_names))
              precision    recall  f1-score   support

    airplane       0.89      0.89      0.89      1000
  automobile       0.91      0.95      0.93      1000
        bird       0.90      0.80      0.85      1000
         cat       0.77      0.70      0.74      1000
        deer       0.86      0.87      0.86      1000
         dog       0.78      0.82      0.80      1000
        frog       0.89      0.90      0.90      1000
       horse       0.89      0.90      0.89      1000
        ship       0.93      0.93      0.93      1000
       truck       0.87      0.94      0.90      1000

    accuracy                           0.87     10000
   macro avg       0.87      0.87      0.87     10000
weighted avg       0.87      0.87      0.87     10000

Questão 4¶

Utilize a rede neural perceptron de múltiplas camadas do tipo NARX (rede recorrente) para fazer a predição de um passo $x^{(n+1)}$ da série temporal $x(n) = 1 + cos(n + cos^2(n))$, n=0,1,2,3,.... Gere inicialmente um conjunto de amostras para o treinamento, definindo o erro de predição como $e^{(n+1)}=x(n+1)-x^{(n+1)}$. Avalie o desempenho mostrando a curva a série temporal, a curva de predição o e curva do erro de predição.

In [1]:
from tensorflow.keras.layers import Dense, Reshape, Flatten, Dropout, Activation, BatchNormalization, LSTM, Embedding, Input
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator
from tensorflow.keras.callbacks import EarlyStopping, ReduceLROnPlateau
In [2]:
# defining function
def f(x):
  return 1 + np.cos(x + (np.cos(x))**2)
In [5]:
x = np.linspace(0, 100, 10000)
y = [f(i) for i in x]
In [6]:
# plot data
points = 10000
plt.plot(x[:points], y[:points])
plt.show()
In [8]:
# splitting data into training and testing
test_size = 2000
x_train = x[:-test_size]
y_train = y[:-test_size]
x_test = x[-test_size:]
y_test = y[-test_size:]
In [9]:
# plot data
fig, axes = plt.subplots(ncols=2, figsize=(24, 5))
axes[0].plot(x_train, y_train, label='train')
axes[0].plot(x_test, y_test, label='test')
axes[0].set_title('Data')
axes[0].legend()
axes[1].plot(x_train, y_train, label='train')
axes[1].plot(x_test, y_test, label='test')
axes[1].set_title('Data')
axes[1].legend()
axes[1].set_xlim(0, 100)
axes[1].set_ylim(0, 2)
plt.show()
In [10]:
sequence_x = list(TimeseriesGenerator(x_train, x_train, 4, batch_size=1))
sequence_y = list(TimeseriesGenerator(x_train[4:], x_train[4:], 3, batch_size=1))

train_seqs = []
y_train = []
for (x_seq, next_x), (next_seq, _) in zip(sequence_x, sequence_y):
    seq = np.append(x_seq.reshape(4,), next_seq.reshape(3, ))
    
    train_seqs.append(seq)
    y_train.append(next_x)

train_seqs = np.array(train_seqs)
y_train = np.array(y_train)
In [11]:
sequence_x = list(TimeseriesGenerator(x_test, x_test, 4, batch_size=1))
sequence_y = list(TimeseriesGenerator(x_test[4:], x_test[4:], 3, batch_size=1))

test_seqs = []
y_test = []
for (x_seq, next_x), (next_seq, _) in zip(sequence_x, sequence_y):
    seq = np.append(x_seq.reshape(4,), next_seq.reshape(3, ))
    
    test_seqs.append(seq)
    y_test.append(next_x)

test_seqs = np.array(test_seqs)
y_test = np.array(y_test)
In [12]:
# building the model
model = Sequential([
    LSTM(128, input_shape=(7, 1), return_sequences=True),
    LSTM(64),
    Dense(1)
])

model.compile(loss="mean_squared_error", optimizer="adam")

model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 lstm (LSTM)                 (None, 7, 128)            66560     
                                                                 
 lstm_1 (LSTM)               (None, 64)                49408     
                                                                 
 dense (Dense)               (None, 1)                 65        
                                                                 
=================================================================
Total params: 116,033
Trainable params: 116,033
Non-trainable params: 0
_________________________________________________________________
In [13]:
# Training the model
history = model.fit(
    train_seqs, y_train,
    validation_split=0.1,
    batch_size=8,
    epochs=100,
    shuffle=True,
    callbacks=[
        EarlyStopping(monitor='val_loss', patience=5),
        ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=5, min_lr=0.0001)
    ]
)
Epoch 1/100
900/900 [==============================] - 18s 15ms/step - loss: 396.9484 - val_loss: 888.2201 - lr: 0.0010
Epoch 2/100
900/900 [==============================] - 12s 14ms/step - loss: 30.8379 - val_loss: 269.7425 - lr: 0.0010
Epoch 3/100
900/900 [==============================] - 12s 14ms/step - loss: 3.9125 - val_loss: 116.8918 - lr: 0.0010
Epoch 4/100
900/900 [==============================] - 12s 14ms/step - loss: 0.6845 - val_loss: 65.4554 - lr: 0.0010
Epoch 5/100
900/900 [==============================] - 12s 13ms/step - loss: 0.3236 - val_loss: 44.9251 - lr: 0.0010
Epoch 6/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0635 - val_loss: 34.4474 - lr: 0.0010
Epoch 7/100
900/900 [==============================] - 12s 13ms/step - loss: 0.1411 - val_loss: 29.1309 - lr: 0.0010
Epoch 8/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0852 - val_loss: 26.6374 - lr: 0.0010
Epoch 9/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0464 - val_loss: 24.4883 - lr: 0.0010
Epoch 10/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0964 - val_loss: 23.4697 - lr: 0.0010
Epoch 11/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0574 - val_loss: 23.2975 - lr: 0.0010
Epoch 12/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0649 - val_loss: 54.3588 - lr: 0.0010
Epoch 13/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0961 - val_loss: 25.9568 - lr: 0.0010
Epoch 14/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0266 - val_loss: 22.9685 - lr: 0.0010
Epoch 15/100
900/900 [==============================] - 13s 14ms/step - loss: 0.0913 - val_loss: 23.4417 - lr: 0.0010
Epoch 16/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0476 - val_loss: 23.7311 - lr: 0.0010
Epoch 17/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0251 - val_loss: 21.4442 - lr: 0.0010
Epoch 18/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0392 - val_loss: 20.8180 - lr: 0.0010
Epoch 19/100
900/900 [==============================] - 13s 14ms/step - loss: 0.0680 - val_loss: 19.8773 - lr: 0.0010
Epoch 20/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0369 - val_loss: 21.5628 - lr: 0.0010
Epoch 21/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0423 - val_loss: 22.3235 - lr: 0.0010
Epoch 22/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0675 - val_loss: 18.6382 - lr: 0.0010
Epoch 23/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0404 - val_loss: 22.6039 - lr: 0.0010
Epoch 24/100
900/900 [==============================] - 13s 14ms/step - loss: 0.0613 - val_loss: 18.8377 - lr: 0.0010
Epoch 25/100
900/900 [==============================] - 13s 14ms/step - loss: 0.0054 - val_loss: 18.0627 - lr: 0.0010
Epoch 26/100
900/900 [==============================] - 13s 14ms/step - loss: 0.0381 - val_loss: 19.9228 - lr: 0.0010
Epoch 27/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0430 - val_loss: 21.3316 - lr: 0.0010
Epoch 28/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0353 - val_loss: 19.2926 - lr: 0.0010
Epoch 29/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0191 - val_loss: 17.4730 - lr: 0.0010
Epoch 30/100
900/900 [==============================] - 13s 15ms/step - loss: 0.0306 - val_loss: 17.8915 - lr: 0.0010
Epoch 31/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0687 - val_loss: 19.3206 - lr: 0.0010
Epoch 32/100
900/900 [==============================] - 12s 14ms/step - loss: 0.0118 - val_loss: 18.8057 - lr: 0.0010
Epoch 33/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0397 - val_loss: 22.1194 - lr: 0.0010
Epoch 34/100
900/900 [==============================] - 12s 13ms/step - loss: 0.0425 - val_loss: 19.9963 - lr: 0.0010
In [14]:
# plotting loss
fig, ax = plt.subplots(figsize=(8, 6))

ax.plot(history.history['loss'], label='Training loss')
ax.plot(history.history['val_loss'], label='Validation loss')

ax.legend()
ax.set(
    ylabel='Loss',
    xlabel='Epoch'
)

plt.legend()
plt.tight_layout()
plt.show()
In [16]:
# predicting
y_pred = model.predict(test_seqs)
63/63 [==============================] - 3s 12ms/step
In [18]:
fig, axes = plt.subplots(ncols=2, figsize=(25, 6))

axes[0].plot(y_test, marker='.', label='Groundtruth')
axes[0].scatter(
    range(len(y_pred)), y_pred, 
    marker='X', edgecolors='k', 
    label='Predictions', c='#ff7f0e', 
    s=64
)

axes[0].legend()
axes[0].set(
    ylabel='$\hat{x}(n)$',
    xlabel='$n$'
)

axes[1].plot(y_test[:50], marker='.', label='Groundtruth')
axes[1].scatter(
    range(len(y_pred[:50])), y_pred[:50], 
    marker='X', edgecolors='k', 
    label='Predictions', c='#ff7f0e', 
    s=64
)

axes[1].legend()
axes[1].set(
    ylabel='$\hat{x}(n)$',
    xlabel='$n$'
)

plt.tight_layout()
plt.show()

Questão 5¶

Considere quatro distribuições gaussianas, $C_1, C_2, C_3, C_4$, em um espaço de entrada de dimensionalidade igual a oito, isto é x = ($x_1, x_2, ..., x_8)^t$ .Todas as nuvens de dados formadas têm variâncias unitária, mas centros ou vetores média são diferentes e dados por $m_1 = (0,0,0,0,0,0,0,0)^t , m_2 = (4,0,0,0,0,0,0,0)^t, m_3 = (0,0,0,4,0,0,0,0)^t, m_4 = (0,0,0,0,0,0,0,4)^t $.

Utilize uma rede de autoeconders para reduzir a dimensionalidade dos dados para duas di- mensões. O objetivo é visualizar os dados de dimensão 8 em um espaço de dimensão 2. Esboce os dados neste novo espaço.

Observação: Gere inicialmente os dados em dimensão oito para cada uma das distribuições gaussianas. Selecione o conjunto de treinamento. Defina uma rede de autoencoder com uma arquitetura por exemplo do tipo 8:2:8 ou outro equivalente com mais camadas mas que reduza para 2 dimensões. Após o treinamento faça a redução de dimensionalidade com a rede de arquitetura 8:2 por exemplo.

In [ ]:
from tensorflow.keras import Model
In [ ]:
# generate four gaussian distributions with 8 dimensions, mean m1 = (0,0,0,0,0,0,0,0), m2 = (4,0,0,0,0,0,0,0), m3 = (0,0,0,4,0,0,0,0), m4 = (0,0,0,0,0,0,0,4) and covariance matrix I
m1 = np.zeros(8)
m2 = np.array([4,0,0,0,0,0,0,0])
m3 = np.array([0,0,0,4,0,0,0,0])
m4 = np.array([0,0,0,0,0,0,0,4])
I = np.eye(8)
x1 = np.random.multivariate_normal(m1, I, 1000)
x2 = np.random.multivariate_normal(m2, I, 1000)
x3 = np.random.multivariate_normal(m3, I, 1000)
x4 = np.random.multivariate_normal(m4, I, 1000)
In [ ]:
# plot the data
fig, axes = plt.subplots(ncols=2, figsize=(25, 6))
axes[0].scatter(x1[:,0], x1[:,1], label='Class 1')
axes[0].scatter(x2[:,0], x2[:,1], label='Class 2')
axes[0].scatter(x3[:,4], x3[:,3], label='Class 3')
axes[0].scatter(x4[:,-1], x4[:,-2], label='Class 4')
axes[0].legend()
axes[0].set(
    ylabel='$x_2$',
    xlabel='$x_1$'
)
plt.show()
In [ ]:
# create an autoencoder netork to reduce the dimensionality of the data to 2 and plot the data
input_dim = Input(shape=(8,))
encoded = Dense(4, activation='relu')(input_dim)
encoded = Dense(2, activation='relu')(input_dim)
decoded = Dense(4, activation='relu')(encoded)
decoded = Dense(8, activation='linear')(encoded)
autoencoder = Model(input_dim, decoded)
encoder = Model(input_dim, encoded)
autoencoder.compile(optimizer='rmsprop', loss='mean_absolute_error')
history1 = autoencoder.fit(x1, x1, epochs=300, batch_size=256, shuffle=True, validation_split=0.1)
history2 = autoencoder.fit(x2, x2, epochs=300, batch_size=256, shuffle=True, validation_split=0.1)
history4 = autoencoder.fit(x3, x3, epochs=300, batch_size=256, shuffle=True, validation_split=0.1)
history3 = autoencoder.fit(x4, x4, epochs=300, batch_size=256, shuffle=True, validation_split=0.1)
encoded_x1 = encoder.predict(x1)
encoded_x2 = encoder.predict(x2)
encoded_x3 = encoder.predict(x3)
encoded_x4 = encoder.predict(x4)
Epoch 1/300
4/4 [==============================] - 0s 42ms/step - loss: 0.9442 - val_loss: 0.9154
Epoch 2/300
4/4 [==============================] - 0s 8ms/step - loss: 0.9352 - val_loss: 0.9069
Epoch 3/300
4/4 [==============================] - 0s 8ms/step - loss: 0.9290 - val_loss: 0.9002
Epoch 4/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9239 - val_loss: 0.8942
Epoch 5/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9192 - val_loss: 0.8886
Epoch 6/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9148 - val_loss: 0.8832
Epoch 7/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9107 - val_loss: 0.8781
Epoch 8/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9066 - val_loss: 0.8731
Epoch 9/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9028 - val_loss: 0.8683
Epoch 10/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8990 - val_loss: 0.8638
Epoch 11/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8955 - val_loss: 0.8592
Epoch 12/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8920 - val_loss: 0.8549
Epoch 13/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8886 - val_loss: 0.8506
Epoch 14/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8853 - val_loss: 0.8465
Epoch 15/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8821 - val_loss: 0.8425
Epoch 16/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8790 - val_loss: 0.8383
Epoch 17/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8759 - val_loss: 0.8344
Epoch 18/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8730 - val_loss: 0.8307
Epoch 19/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8701 - val_loss: 0.8270
Epoch 20/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8672 - val_loss: 0.8235
Epoch 21/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8645 - val_loss: 0.8201
Epoch 22/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8617 - val_loss: 0.8167
Epoch 23/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8590 - val_loss: 0.8133
Epoch 24/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8564 - val_loss: 0.8104
Epoch 25/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8540 - val_loss: 0.8076
Epoch 26/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8516 - val_loss: 0.8045
Epoch 27/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8491 - val_loss: 0.8016
Epoch 28/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8468 - val_loss: 0.7988
Epoch 29/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8445 - val_loss: 0.7961
Epoch 30/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8423 - val_loss: 0.7935
Epoch 31/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8402 - val_loss: 0.7911
Epoch 32/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8381 - val_loss: 0.7887
Epoch 33/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8360 - val_loss: 0.7864
Epoch 34/300
4/4 [==============================] - 0s 16ms/step - loss: 0.8340 - val_loss: 0.7842
Epoch 35/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8321 - val_loss: 0.7819
Epoch 36/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8302 - val_loss: 0.7799
Epoch 37/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8283 - val_loss: 0.7778
Epoch 38/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8265 - val_loss: 0.7758
Epoch 39/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8248 - val_loss: 0.7740
Epoch 40/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8231 - val_loss: 0.7721
Epoch 41/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8214 - val_loss: 0.7702
Epoch 42/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8197 - val_loss: 0.7684
Epoch 43/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8181 - val_loss: 0.7667
Epoch 44/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8165 - val_loss: 0.7650
Epoch 45/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8149 - val_loss: 0.7633
Epoch 46/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8135 - val_loss: 0.7616
Epoch 47/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8120 - val_loss: 0.7600
Epoch 48/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8105 - val_loss: 0.7583
Epoch 49/300
4/4 [==============================] - 0s 15ms/step - loss: 0.8090 - val_loss: 0.7568
Epoch 50/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8076 - val_loss: 0.7553
Epoch 51/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8062 - val_loss: 0.7537
Epoch 52/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8048 - val_loss: 0.7523
Epoch 53/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8035 - val_loss: 0.7510
Epoch 54/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8022 - val_loss: 0.7497
Epoch 55/300
4/4 [==============================] - 0s 8ms/step - loss: 0.8009 - val_loss: 0.7483
Epoch 56/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7996 - val_loss: 0.7470
Epoch 57/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7984 - val_loss: 0.7457
Epoch 58/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7972 - val_loss: 0.7444
Epoch 59/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7960 - val_loss: 0.7431
Epoch 60/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7948 - val_loss: 0.7419
Epoch 61/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7936 - val_loss: 0.7407
Epoch 62/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7925 - val_loss: 0.7393
Epoch 63/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7914 - val_loss: 0.7381
Epoch 64/300
4/4 [==============================] - 0s 15ms/step - loss: 0.7902 - val_loss: 0.7369
Epoch 65/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7891 - val_loss: 0.7357
Epoch 66/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7880 - val_loss: 0.7345
Epoch 67/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7869 - val_loss: 0.7333
Epoch 68/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7858 - val_loss: 0.7321
Epoch 69/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7848 - val_loss: 0.7310
Epoch 70/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7837 - val_loss: 0.7300
Epoch 71/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7827 - val_loss: 0.7289
Epoch 72/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7816 - val_loss: 0.7278
Epoch 73/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7805 - val_loss: 0.7267
Epoch 74/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7795 - val_loss: 0.7257
Epoch 75/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7784 - val_loss: 0.7246
Epoch 76/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7774 - val_loss: 0.7235
Epoch 77/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7763 - val_loss: 0.7224
Epoch 78/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7753 - val_loss: 0.7214
Epoch 79/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7743 - val_loss: 0.7203
Epoch 80/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7733 - val_loss: 0.7193
Epoch 81/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7723 - val_loss: 0.7184
Epoch 82/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7713 - val_loss: 0.7175
Epoch 83/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7703 - val_loss: 0.7165
Epoch 84/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7693 - val_loss: 0.7157
Epoch 85/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7683 - val_loss: 0.7148
Epoch 86/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7673 - val_loss: 0.7139
Epoch 87/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7663 - val_loss: 0.7131
Epoch 88/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7653 - val_loss: 0.7122
Epoch 89/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7644 - val_loss: 0.7112
Epoch 90/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7634 - val_loss: 0.7104
Epoch 91/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7625 - val_loss: 0.7096
Epoch 92/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7615 - val_loss: 0.7087
Epoch 93/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7606 - val_loss: 0.7079
Epoch 94/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7597 - val_loss: 0.7070
Epoch 95/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7588 - val_loss: 0.7061
Epoch 96/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7579 - val_loss: 0.7052
Epoch 97/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7571 - val_loss: 0.7043
Epoch 98/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7562 - val_loss: 0.7034
Epoch 99/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7553 - val_loss: 0.7025
Epoch 100/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7545 - val_loss: 0.7018
Epoch 101/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7537 - val_loss: 0.7010
Epoch 102/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7528 - val_loss: 0.7001
Epoch 103/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7520 - val_loss: 0.6993
Epoch 104/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7512 - val_loss: 0.6984
Epoch 105/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7503 - val_loss: 0.6976
Epoch 106/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7496 - val_loss: 0.6968
Epoch 107/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7488 - val_loss: 0.6960
Epoch 108/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7480 - val_loss: 0.6952
Epoch 109/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7473 - val_loss: 0.6944
Epoch 110/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7465 - val_loss: 0.6936
Epoch 111/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7458 - val_loss: 0.6928
Epoch 112/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7451 - val_loss: 0.6921
Epoch 113/300
4/4 [==============================] - 0s 15ms/step - loss: 0.7444 - val_loss: 0.6915
Epoch 114/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7437 - val_loss: 0.6907
Epoch 115/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7430 - val_loss: 0.6900
Epoch 116/300
4/4 [==============================] - 0s 16ms/step - loss: 0.7423 - val_loss: 0.6894
Epoch 117/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7417 - val_loss: 0.6887
Epoch 118/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7411 - val_loss: 0.6880
Epoch 119/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7404 - val_loss: 0.6873
Epoch 120/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7398 - val_loss: 0.6867
Epoch 121/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7391 - val_loss: 0.6861
Epoch 122/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7385 - val_loss: 0.6855
Epoch 123/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7380 - val_loss: 0.6848
Epoch 124/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7374 - val_loss: 0.6842
Epoch 125/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7368 - val_loss: 0.6836
Epoch 126/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7362 - val_loss: 0.6829
Epoch 127/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7356 - val_loss: 0.6824
Epoch 128/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7351 - val_loss: 0.6818
Epoch 129/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7345 - val_loss: 0.6813
Epoch 130/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7340 - val_loss: 0.6806
Epoch 131/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7334 - val_loss: 0.6801
Epoch 132/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7329 - val_loss: 0.6796
Epoch 133/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7324 - val_loss: 0.6791
Epoch 134/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7319 - val_loss: 0.6786
Epoch 135/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7314 - val_loss: 0.6781
Epoch 136/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7309 - val_loss: 0.6777
Epoch 137/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7304 - val_loss: 0.6772
Epoch 138/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7299 - val_loss: 0.6768
Epoch 139/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7294 - val_loss: 0.6764
Epoch 140/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7289 - val_loss: 0.6759
Epoch 141/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7284 - val_loss: 0.6754
Epoch 142/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7279 - val_loss: 0.6751
Epoch 143/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7275 - val_loss: 0.6747
Epoch 144/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7271 - val_loss: 0.6743
Epoch 145/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7266 - val_loss: 0.6739
Epoch 146/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7262 - val_loss: 0.6735
Epoch 147/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7258 - val_loss: 0.6731
Epoch 148/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7254 - val_loss: 0.6726
Epoch 149/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7250 - val_loss: 0.6721
Epoch 150/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7246 - val_loss: 0.6717
Epoch 151/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7242 - val_loss: 0.6713
Epoch 152/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7238 - val_loss: 0.6710
Epoch 153/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7234 - val_loss: 0.6706
Epoch 154/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7230 - val_loss: 0.6702
Epoch 155/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7226 - val_loss: 0.6698
Epoch 156/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7222 - val_loss: 0.6695
Epoch 157/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7219 - val_loss: 0.6691
Epoch 158/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7215 - val_loss: 0.6686
Epoch 159/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7212 - val_loss: 0.6683
Epoch 160/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7208 - val_loss: 0.6679
Epoch 161/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7205 - val_loss: 0.6673
Epoch 162/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7201 - val_loss: 0.6669
Epoch 163/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7198 - val_loss: 0.6666
Epoch 164/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7195 - val_loss: 0.6662
Epoch 165/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7191 - val_loss: 0.6659
Epoch 166/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7188 - val_loss: 0.6656
Epoch 167/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7185 - val_loss: 0.6651
Epoch 168/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7182 - val_loss: 0.6648
Epoch 169/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7179 - val_loss: 0.6646
Epoch 170/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7176 - val_loss: 0.6643
Epoch 171/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7173 - val_loss: 0.6640
Epoch 172/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7170 - val_loss: 0.6638
Epoch 173/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7167 - val_loss: 0.6635
Epoch 174/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7164 - val_loss: 0.6631
Epoch 175/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7161 - val_loss: 0.6627
Epoch 176/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7158 - val_loss: 0.6623
Epoch 177/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7155 - val_loss: 0.6619
Epoch 178/300
4/4 [==============================] - 0s 8ms/step - loss: 0.7152 - val_loss: 0.6615
Epoch 179/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7149 - val_loss: 0.6612
Epoch 180/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7146 - val_loss: 0.6608
Epoch 181/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7143 - val_loss: 0.6605
Epoch 182/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7140 - val_loss: 0.6602
Epoch 183/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7137 - val_loss: 0.6600
Epoch 184/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7135 - val_loss: 0.6597
Epoch 185/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7131 - val_loss: 0.6594
Epoch 186/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7128 - val_loss: 0.6591
Epoch 187/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7126 - val_loss: 0.6588
Epoch 188/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7123 - val_loss: 0.6585
Epoch 189/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7120 - val_loss: 0.6583
Epoch 190/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7117 - val_loss: 0.6579
Epoch 191/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7114 - val_loss: 0.6576
Epoch 192/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7112 - val_loss: 0.6572
Epoch 193/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7109 - val_loss: 0.6569
Epoch 194/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7106 - val_loss: 0.6565
Epoch 195/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7104 - val_loss: 0.6560
Epoch 196/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7101 - val_loss: 0.6556
Epoch 197/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7098 - val_loss: 0.6552
Epoch 198/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7095 - val_loss: 0.6548
Epoch 199/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7092 - val_loss: 0.6544
Epoch 200/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7089 - val_loss: 0.6540
Epoch 201/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7086 - val_loss: 0.6536
Epoch 202/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7083 - val_loss: 0.6532
Epoch 203/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7079 - val_loss: 0.6527
Epoch 204/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7076 - val_loss: 0.6523
Epoch 205/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7073 - val_loss: 0.6520
Epoch 206/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7070 - val_loss: 0.6516
Epoch 207/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7067 - val_loss: 0.6514
Epoch 208/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7064 - val_loss: 0.6511
Epoch 209/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7061 - val_loss: 0.6509
Epoch 210/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7058 - val_loss: 0.6506
Epoch 211/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7055 - val_loss: 0.6505
Epoch 212/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7053 - val_loss: 0.6502
Epoch 213/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7050 - val_loss: 0.6500
Epoch 214/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7047 - val_loss: 0.6497
Epoch 215/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7045 - val_loss: 0.6494
Epoch 216/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7042 - val_loss: 0.6492
Epoch 217/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7040 - val_loss: 0.6489
Epoch 218/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7037 - val_loss: 0.6487
Epoch 219/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7035 - val_loss: 0.6485
Epoch 220/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7032 - val_loss: 0.6483
Epoch 221/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7030 - val_loss: 0.6479
Epoch 222/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7027 - val_loss: 0.6477
Epoch 223/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7025 - val_loss: 0.6475
Epoch 224/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7022 - val_loss: 0.6472
Epoch 225/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7020 - val_loss: 0.6469
Epoch 226/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7017 - val_loss: 0.6465
Epoch 227/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7015 - val_loss: 0.6461
Epoch 228/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7012 - val_loss: 0.6459
Epoch 229/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7010 - val_loss: 0.6456
Epoch 230/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7007 - val_loss: 0.6453
Epoch 231/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7004 - val_loss: 0.6450
Epoch 232/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7002 - val_loss: 0.6448
Epoch 233/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6999 - val_loss: 0.6446
Epoch 234/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6997 - val_loss: 0.6444
Epoch 235/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6995 - val_loss: 0.6441
Epoch 236/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6992 - val_loss: 0.6438
Epoch 237/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6990 - val_loss: 0.6435
Epoch 238/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6987 - val_loss: 0.6433
Epoch 239/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6985 - val_loss: 0.6430
Epoch 240/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6982 - val_loss: 0.6427
Epoch 241/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6979 - val_loss: 0.6424
Epoch 242/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6977 - val_loss: 0.6420
Epoch 243/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6975 - val_loss: 0.6418
Epoch 244/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6972 - val_loss: 0.6415
Epoch 245/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6970 - val_loss: 0.6412
Epoch 246/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6967 - val_loss: 0.6410
Epoch 247/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6964 - val_loss: 0.6406
Epoch 248/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6962 - val_loss: 0.6404
Epoch 249/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6959 - val_loss: 0.6401
Epoch 250/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6957 - val_loss: 0.6399
Epoch 251/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6954 - val_loss: 0.6395
Epoch 252/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6951 - val_loss: 0.6393
Epoch 253/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6949 - val_loss: 0.6390
Epoch 254/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6946 - val_loss: 0.6388
Epoch 255/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6944 - val_loss: 0.6385
Epoch 256/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6942 - val_loss: 0.6384
Epoch 257/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6940 - val_loss: 0.6382
Epoch 258/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6937 - val_loss: 0.6378
Epoch 259/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6934 - val_loss: 0.6374
Epoch 260/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6932 - val_loss: 0.6372
Epoch 261/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6930 - val_loss: 0.6368
Epoch 262/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6927 - val_loss: 0.6365
Epoch 263/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6925 - val_loss: 0.6362
Epoch 264/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6922 - val_loss: 0.6358
Epoch 265/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6920 - val_loss: 0.6356
Epoch 266/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6918 - val_loss: 0.6353
Epoch 267/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6915 - val_loss: 0.6351
Epoch 268/300
4/4 [==============================] - 0s 8ms/step - loss: 0.6913 - val_loss: 0.6347
Epoch 269/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6910 - val_loss: 0.6343
Epoch 270/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6908 - val_loss: 0.6341
Epoch 271/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6905 - val_loss: 0.6337
Epoch 272/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6902 - val_loss: 0.6334
Epoch 273/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6900 - val_loss: 0.6332
Epoch 274/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6898 - val_loss: 0.6328
Epoch 275/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6895 - val_loss: 0.6326
Epoch 276/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6892 - val_loss: 0.6324
Epoch 277/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6890 - val_loss: 0.6321
Epoch 278/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6887 - val_loss: 0.6317
Epoch 279/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6885 - val_loss: 0.6314
Epoch 280/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6882 - val_loss: 0.6311
Epoch 281/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6880 - val_loss: 0.6306
Epoch 282/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6877 - val_loss: 0.6303
Epoch 283/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6874 - val_loss: 0.6301
Epoch 284/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6872 - val_loss: 0.6297
Epoch 285/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6869 - val_loss: 0.6295
Epoch 286/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6867 - val_loss: 0.6291
Epoch 287/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6864 - val_loss: 0.6289
Epoch 288/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6861 - val_loss: 0.6286
Epoch 289/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6859 - val_loss: 0.6282
Epoch 290/300
4/4 [==============================] - 0s 15ms/step - loss: 0.6856 - val_loss: 0.6278
Epoch 291/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6853 - val_loss: 0.6273
Epoch 292/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6850 - val_loss: 0.6271
Epoch 293/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6847 - val_loss: 0.6265
Epoch 294/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6844 - val_loss: 0.6261
Epoch 295/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6841 - val_loss: 0.6258
Epoch 296/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6838 - val_loss: 0.6253
Epoch 297/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6835 - val_loss: 0.6248
Epoch 298/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6832 - val_loss: 0.6245
Epoch 299/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6830 - val_loss: 0.6241
Epoch 300/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6827 - val_loss: 0.6235
Epoch 1/300
4/4 [==============================] - 0s 17ms/step - loss: 1.0752 - val_loss: 1.0588
Epoch 2/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0625 - val_loss: 1.0504
Epoch 3/300
4/4 [==============================] - 0s 10ms/step - loss: 1.0541 - val_loss: 1.0436
Epoch 4/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0472 - val_loss: 1.0379
Epoch 5/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0413 - val_loss: 1.0328
Epoch 6/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0362 - val_loss: 1.0283
Epoch 7/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0315 - val_loss: 1.0240
Epoch 8/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0272 - val_loss: 1.0203
Epoch 9/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0231 - val_loss: 1.0165
Epoch 10/300
4/4 [==============================] - 0s 10ms/step - loss: 1.0191 - val_loss: 1.0132
Epoch 11/300
4/4 [==============================] - 0s 11ms/step - loss: 1.0154 - val_loss: 1.0097
Epoch 12/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0117 - val_loss: 1.0065
Epoch 13/300
4/4 [==============================] - 0s 10ms/step - loss: 1.0082 - val_loss: 1.0033
Epoch 14/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0049 - val_loss: 0.9999
Epoch 15/300
4/4 [==============================] - 0s 9ms/step - loss: 1.0015 - val_loss: 0.9972
Epoch 16/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9983 - val_loss: 0.9941
Epoch 17/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9951 - val_loss: 0.9914
Epoch 18/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9921 - val_loss: 0.9883
Epoch 19/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9891 - val_loss: 0.9855
Epoch 20/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9861 - val_loss: 0.9823
Epoch 21/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9831 - val_loss: 0.9797
Epoch 22/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9803 - val_loss: 0.9767
Epoch 23/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9774 - val_loss: 0.9738
Epoch 24/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9745 - val_loss: 0.9708
Epoch 25/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9715 - val_loss: 0.9680
Epoch 26/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9686 - val_loss: 0.9651
Epoch 27/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9656 - val_loss: 0.9622
Epoch 28/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9627 - val_loss: 0.9592
Epoch 29/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9597 - val_loss: 0.9567
Epoch 30/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9567 - val_loss: 0.9535
Epoch 31/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9537 - val_loss: 0.9504
Epoch 32/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9507 - val_loss: 0.9474
Epoch 33/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9477 - val_loss: 0.9442
Epoch 34/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9447 - val_loss: 0.9414
Epoch 35/300
4/4 [==============================] - 0s 8ms/step - loss: 0.9416 - val_loss: 0.9384
Epoch 36/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9386 - val_loss: 0.9353
Epoch 37/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9354 - val_loss: 0.9327
Epoch 38/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9324 - val_loss: 0.9294
Epoch 39/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9292 - val_loss: 0.9264
Epoch 40/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9262 - val_loss: 0.9232
Epoch 41/300
4/4 [==============================] - 0s 15ms/step - loss: 0.9230 - val_loss: 0.9203
Epoch 42/300
4/4 [==============================] - 0s 14ms/step - loss: 0.9199 - val_loss: 0.9169
Epoch 43/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9167 - val_loss: 0.9138
Epoch 44/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9135 - val_loss: 0.9104
Epoch 45/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9103 - val_loss: 0.9075
Epoch 46/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9071 - val_loss: 0.9044
Epoch 47/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9039 - val_loss: 0.9011
Epoch 48/300
4/4 [==============================] - 0s 9ms/step - loss: 0.9006 - val_loss: 0.8979
Epoch 49/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8973 - val_loss: 0.8944
Epoch 50/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8939 - val_loss: 0.8909
Epoch 51/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8906 - val_loss: 0.8877
Epoch 52/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8873 - val_loss: 0.8844
Epoch 53/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8840 - val_loss: 0.8811
Epoch 54/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8806 - val_loss: 0.8779
Epoch 55/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8772 - val_loss: 0.8745
Epoch 56/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8738 - val_loss: 0.8713
Epoch 57/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8705 - val_loss: 0.8679
Epoch 58/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8670 - val_loss: 0.8643
Epoch 59/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8635 - val_loss: 0.8610
Epoch 60/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8602 - val_loss: 0.8574
Epoch 61/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8566 - val_loss: 0.8543
Epoch 62/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8532 - val_loss: 0.8512
Epoch 63/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8496 - val_loss: 0.8475
Epoch 64/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8461 - val_loss: 0.8441
Epoch 65/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8425 - val_loss: 0.8408
Epoch 66/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8390 - val_loss: 0.8374
Epoch 67/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8355 - val_loss: 0.8340
Epoch 68/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8319 - val_loss: 0.8309
Epoch 69/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8284 - val_loss: 0.8271
Epoch 70/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8249 - val_loss: 0.8240
Epoch 71/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8215 - val_loss: 0.8208
Epoch 72/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8181 - val_loss: 0.8172
Epoch 73/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8146 - val_loss: 0.8134
Epoch 74/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8110 - val_loss: 0.8102
Epoch 75/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8076 - val_loss: 0.8067
Epoch 76/300
4/4 [==============================] - 0s 16ms/step - loss: 0.8041 - val_loss: 0.8031
Epoch 77/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8007 - val_loss: 0.8001
Epoch 78/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7974 - val_loss: 0.7967
Epoch 79/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7940 - val_loss: 0.7933
Epoch 80/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7905 - val_loss: 0.7898
Epoch 81/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7870 - val_loss: 0.7860
Epoch 82/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7835 - val_loss: 0.7832
Epoch 83/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7802 - val_loss: 0.7794
Epoch 84/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7768 - val_loss: 0.7757
Epoch 85/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7734 - val_loss: 0.7727
Epoch 86/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7702 - val_loss: 0.7694
Epoch 87/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7668 - val_loss: 0.7660
Epoch 88/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7636 - val_loss: 0.7631
Epoch 89/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7604 - val_loss: 0.7596
Epoch 90/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7572 - val_loss: 0.7557
Epoch 91/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7539 - val_loss: 0.7531
Epoch 92/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7509 - val_loss: 0.7496
Epoch 93/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7479 - val_loss: 0.7467
Epoch 94/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7449 - val_loss: 0.7440
Epoch 95/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7419 - val_loss: 0.7413
Epoch 96/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7390 - val_loss: 0.7381
Epoch 97/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7361 - val_loss: 0.7350
Epoch 98/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7332 - val_loss: 0.7324
Epoch 99/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7305 - val_loss: 0.7300
Epoch 100/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7279 - val_loss: 0.7273
Epoch 101/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7254 - val_loss: 0.7244
Epoch 102/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7228 - val_loss: 0.7221
Epoch 103/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7203 - val_loss: 0.7199
Epoch 104/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7178 - val_loss: 0.7176
Epoch 105/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7154 - val_loss: 0.7155
Epoch 106/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7130 - val_loss: 0.7133
Epoch 107/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7107 - val_loss: 0.7111
Epoch 108/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7084 - val_loss: 0.7092
Epoch 109/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7061 - val_loss: 0.7076
Epoch 110/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7039 - val_loss: 0.7058
Epoch 111/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7018 - val_loss: 0.7036
Epoch 112/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6998 - val_loss: 0.7028
Epoch 113/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6980 - val_loss: 0.7013
Epoch 114/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6962 - val_loss: 0.7002
Epoch 115/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6945 - val_loss: 0.6993
Epoch 116/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6929 - val_loss: 0.6983
Epoch 117/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6913 - val_loss: 0.6970
Epoch 118/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6899 - val_loss: 0.6962
Epoch 119/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6885 - val_loss: 0.6952
Epoch 120/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6872 - val_loss: 0.6944
Epoch 121/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6859 - val_loss: 0.6935
Epoch 122/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6845 - val_loss: 0.6925
Epoch 123/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6833 - val_loss: 0.6919
Epoch 124/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6822 - val_loss: 0.6915
Epoch 125/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6812 - val_loss: 0.6906
Epoch 126/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6801 - val_loss: 0.6900
Epoch 127/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6792 - val_loss: 0.6895
Epoch 128/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6784 - val_loss: 0.6894
Epoch 129/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6775 - val_loss: 0.6891
Epoch 130/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6767 - val_loss: 0.6886
Epoch 131/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6760 - val_loss: 0.6886
Epoch 132/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6754 - val_loss: 0.6880
Epoch 133/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6748 - val_loss: 0.6877
Epoch 134/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6741 - val_loss: 0.6879
Epoch 135/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6737 - val_loss: 0.6876
Epoch 136/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6730 - val_loss: 0.6874
Epoch 137/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6725 - val_loss: 0.6867
Epoch 138/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6720 - val_loss: 0.6869
Epoch 139/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6716 - val_loss: 0.6863
Epoch 140/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6711 - val_loss: 0.6858
Epoch 141/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6707 - val_loss: 0.6856
Epoch 142/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6702 - val_loss: 0.6846
Epoch 143/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6698 - val_loss: 0.6849
Epoch 144/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6694 - val_loss: 0.6845
Epoch 145/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6689 - val_loss: 0.6839
Epoch 146/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6684 - val_loss: 0.6837
Epoch 147/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6683 - val_loss: 0.6831
Epoch 148/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6677 - val_loss: 0.6829
Epoch 149/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6672 - val_loss: 0.6823
Epoch 150/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6669 - val_loss: 0.6817
Epoch 151/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6665 - val_loss: 0.6811
Epoch 152/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6661 - val_loss: 0.6810
Epoch 153/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6657 - val_loss: 0.6808
Epoch 154/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6652 - val_loss: 0.6803
Epoch 155/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6648 - val_loss: 0.6803
Epoch 156/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6646 - val_loss: 0.6797
Epoch 157/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6641 - val_loss: 0.6792
Epoch 158/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6638 - val_loss: 0.6793
Epoch 159/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6633 - val_loss: 0.6788
Epoch 160/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6630 - val_loss: 0.6784
Epoch 161/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6627 - val_loss: 0.6779
Epoch 162/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6624 - val_loss: 0.6774
Epoch 163/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6620 - val_loss: 0.6770
Epoch 164/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6616 - val_loss: 0.6767
Epoch 165/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6614 - val_loss: 0.6764
Epoch 166/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6610 - val_loss: 0.6763
Epoch 167/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6606 - val_loss: 0.6758
Epoch 168/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6603 - val_loss: 0.6749
Epoch 169/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6599 - val_loss: 0.6747
Epoch 170/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6597 - val_loss: 0.6747
Epoch 171/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6594 - val_loss: 0.6738
Epoch 172/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6590 - val_loss: 0.6738
Epoch 173/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6588 - val_loss: 0.6735
Epoch 174/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6585 - val_loss: 0.6729
Epoch 175/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6581 - val_loss: 0.6731
Epoch 176/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6578 - val_loss: 0.6728
Epoch 177/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6575 - val_loss: 0.6723
Epoch 178/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6572 - val_loss: 0.6721
Epoch 179/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6569 - val_loss: 0.6720
Epoch 180/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6567 - val_loss: 0.6714
Epoch 181/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6564 - val_loss: 0.6711
Epoch 182/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6561 - val_loss: 0.6708
Epoch 183/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6557 - val_loss: 0.6706
Epoch 184/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6555 - val_loss: 0.6705
Epoch 185/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6553 - val_loss: 0.6701
Epoch 186/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6551 - val_loss: 0.6692
Epoch 187/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6547 - val_loss: 0.6691
Epoch 188/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6544 - val_loss: 0.6687
Epoch 189/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6542 - val_loss: 0.6686
Epoch 190/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6540 - val_loss: 0.6679
Epoch 191/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6537 - val_loss: 0.6674
Epoch 192/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6535 - val_loss: 0.6675
Epoch 193/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6532 - val_loss: 0.6672
Epoch 194/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6530 - val_loss: 0.6670
Epoch 195/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6527 - val_loss: 0.6665
Epoch 196/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6525 - val_loss: 0.6666
Epoch 197/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6522 - val_loss: 0.6662
Epoch 198/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6520 - val_loss: 0.6661
Epoch 199/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6518 - val_loss: 0.6655
Epoch 200/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6515 - val_loss: 0.6649
Epoch 201/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6512 - val_loss: 0.6653
Epoch 202/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6510 - val_loss: 0.6651
Epoch 203/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6507 - val_loss: 0.6645
Epoch 204/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6506 - val_loss: 0.6644
Epoch 205/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6503 - val_loss: 0.6638
Epoch 206/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6501 - val_loss: 0.6640
Epoch 207/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6499 - val_loss: 0.6635
Epoch 208/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6496 - val_loss: 0.6635
Epoch 209/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6495 - val_loss: 0.6631
Epoch 210/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6492 - val_loss: 0.6627
Epoch 211/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6491 - val_loss: 0.6628
Epoch 212/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6489 - val_loss: 0.6627
Epoch 213/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6486 - val_loss: 0.6623
Epoch 214/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6486 - val_loss: 0.6619
Epoch 215/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6483 - val_loss: 0.6617
Epoch 216/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6481 - val_loss: 0.6616
Epoch 217/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6480 - val_loss: 0.6616
Epoch 218/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6477 - val_loss: 0.6613
Epoch 219/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6475 - val_loss: 0.6614
Epoch 220/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6473 - val_loss: 0.6610
Epoch 221/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6471 - val_loss: 0.6607
Epoch 222/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6469 - val_loss: 0.6604
Epoch 223/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6468 - val_loss: 0.6605
Epoch 224/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6467 - val_loss: 0.6602
Epoch 225/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6465 - val_loss: 0.6597
Epoch 226/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6462 - val_loss: 0.6593
Epoch 227/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6460 - val_loss: 0.6593
Epoch 228/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6459 - val_loss: 0.6594
Epoch 229/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6458 - val_loss: 0.6593
Epoch 230/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6455 - val_loss: 0.6591
Epoch 231/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6454 - val_loss: 0.6590
Epoch 232/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6452 - val_loss: 0.6590
Epoch 233/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6451 - val_loss: 0.6585
Epoch 234/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6450 - val_loss: 0.6588
Epoch 235/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6448 - val_loss: 0.6583
Epoch 236/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6447 - val_loss: 0.6582
Epoch 237/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6446 - val_loss: 0.6582
Epoch 238/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6444 - val_loss: 0.6579
Epoch 239/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6443 - val_loss: 0.6570
Epoch 240/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6440 - val_loss: 0.6573
Epoch 241/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6440 - val_loss: 0.6571
Epoch 242/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6438 - val_loss: 0.6569
Epoch 243/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6437 - val_loss: 0.6567
Epoch 244/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6435 - val_loss: 0.6565
Epoch 245/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6435 - val_loss: 0.6562
Epoch 246/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6433 - val_loss: 0.6564
Epoch 247/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6431 - val_loss: 0.6562
Epoch 248/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6431 - val_loss: 0.6560
Epoch 249/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6429 - val_loss: 0.6560
Epoch 250/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6428 - val_loss: 0.6556
Epoch 251/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6426 - val_loss: 0.6556
Epoch 252/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6425 - val_loss: 0.6555
Epoch 253/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6424 - val_loss: 0.6555
Epoch 254/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6423 - val_loss: 0.6552
Epoch 255/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6422 - val_loss: 0.6549
Epoch 256/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6421 - val_loss: 0.6548
Epoch 257/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6419 - val_loss: 0.6548
Epoch 258/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6417 - val_loss: 0.6546
Epoch 259/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6417 - val_loss: 0.6545
Epoch 260/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6416 - val_loss: 0.6547
Epoch 261/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6416 - val_loss: 0.6545
Epoch 262/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6414 - val_loss: 0.6541
Epoch 263/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6413 - val_loss: 0.6541
Epoch 264/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6412 - val_loss: 0.6533
Epoch 265/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6411 - val_loss: 0.6535
Epoch 266/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6410 - val_loss: 0.6531
Epoch 267/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6408 - val_loss: 0.6535
Epoch 268/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6408 - val_loss: 0.6532
Epoch 269/300
4/4 [==============================] - 0s 19ms/step - loss: 0.6406 - val_loss: 0.6529
Epoch 270/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6406 - val_loss: 0.6528
Epoch 271/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6404 - val_loss: 0.6524
Epoch 272/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6404 - val_loss: 0.6524
Epoch 273/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6403 - val_loss: 0.6524
Epoch 274/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6402 - val_loss: 0.6522
Epoch 275/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6401 - val_loss: 0.6521
Epoch 276/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6400 - val_loss: 0.6522
Epoch 277/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6399 - val_loss: 0.6517
Epoch 278/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6398 - val_loss: 0.6519
Epoch 279/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6397 - val_loss: 0.6514
Epoch 280/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6396 - val_loss: 0.6517
Epoch 281/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6397 - val_loss: 0.6511
Epoch 282/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6394 - val_loss: 0.6516
Epoch 283/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6394 - val_loss: 0.6511
Epoch 284/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6393 - val_loss: 0.6511
Epoch 285/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6392 - val_loss: 0.6508
Epoch 286/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6391 - val_loss: 0.6506
Epoch 287/300
4/4 [==============================] - 0s 17ms/step - loss: 0.6390 - val_loss: 0.6506
Epoch 288/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6390 - val_loss: 0.6502
Epoch 289/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6388 - val_loss: 0.6505
Epoch 290/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6388 - val_loss: 0.6504
Epoch 291/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6387 - val_loss: 0.6501
Epoch 292/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6386 - val_loss: 0.6503
Epoch 293/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6386 - val_loss: 0.6502
Epoch 294/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6385 - val_loss: 0.6499
Epoch 295/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6384 - val_loss: 0.6500
Epoch 296/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6384 - val_loss: 0.6496
Epoch 297/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6383 - val_loss: 0.6496
Epoch 298/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6382 - val_loss: 0.6494
Epoch 299/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6381 - val_loss: 0.6493
Epoch 300/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6380 - val_loss: 0.6493
Epoch 1/300
4/4 [==============================] - 0s 19ms/step - loss: 1.0077 - val_loss: 1.0035
Epoch 2/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9963 - val_loss: 0.9958
Epoch 3/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9880 - val_loss: 0.9889
Epoch 4/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9808 - val_loss: 0.9826
Epoch 5/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9743 - val_loss: 0.9765
Epoch 6/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9682 - val_loss: 0.9707
Epoch 7/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9626 - val_loss: 0.9653
Epoch 8/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9573 - val_loss: 0.9600
Epoch 9/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9524 - val_loss: 0.9551
Epoch 10/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9476 - val_loss: 0.9504
Epoch 11/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9429 - val_loss: 0.9458
Epoch 12/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9383 - val_loss: 0.9414
Epoch 13/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9339 - val_loss: 0.9374
Epoch 14/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9297 - val_loss: 0.9336
Epoch 15/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9257 - val_loss: 0.9298
Epoch 16/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9217 - val_loss: 0.9261
Epoch 17/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9177 - val_loss: 0.9224
Epoch 18/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9137 - val_loss: 0.9186
Epoch 19/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9098 - val_loss: 0.9147
Epoch 20/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9058 - val_loss: 0.9110
Epoch 21/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9019 - val_loss: 0.9072
Epoch 22/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8980 - val_loss: 0.9035
Epoch 23/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8941 - val_loss: 0.8997
Epoch 24/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8902 - val_loss: 0.8961
Epoch 25/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8865 - val_loss: 0.8926
Epoch 26/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8826 - val_loss: 0.8890
Epoch 27/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8789 - val_loss: 0.8854
Epoch 28/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8752 - val_loss: 0.8819
Epoch 29/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8715 - val_loss: 0.8787
Epoch 30/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8679 - val_loss: 0.8753
Epoch 31/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8643 - val_loss: 0.8716
Epoch 32/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8608 - val_loss: 0.8683
Epoch 33/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8573 - val_loss: 0.8651
Epoch 34/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8540 - val_loss: 0.8619
Epoch 35/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8506 - val_loss: 0.8587
Epoch 36/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8472 - val_loss: 0.8554
Epoch 37/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8440 - val_loss: 0.8524
Epoch 38/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8408 - val_loss: 0.8491
Epoch 39/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8376 - val_loss: 0.8461
Epoch 40/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8345 - val_loss: 0.8430
Epoch 41/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8314 - val_loss: 0.8398
Epoch 42/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8284 - val_loss: 0.8371
Epoch 43/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8254 - val_loss: 0.8340
Epoch 44/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8224 - val_loss: 0.8312
Epoch 45/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8193 - val_loss: 0.8281
Epoch 46/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8165 - val_loss: 0.8253
Epoch 47/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8137 - val_loss: 0.8224
Epoch 48/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8109 - val_loss: 0.8194
Epoch 49/300
4/4 [==============================] - 0s 9ms/step - loss: 0.8081 - val_loss: 0.8166
Epoch 50/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8054 - val_loss: 0.8138
Epoch 51/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8027 - val_loss: 0.8112
Epoch 52/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8000 - val_loss: 0.8085
Epoch 53/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7974 - val_loss: 0.8060
Epoch 54/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7947 - val_loss: 0.8032
Epoch 55/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7921 - val_loss: 0.8004
Epoch 56/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7896 - val_loss: 0.7976
Epoch 57/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7870 - val_loss: 0.7949
Epoch 58/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7844 - val_loss: 0.7924
Epoch 59/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7819 - val_loss: 0.7900
Epoch 60/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7795 - val_loss: 0.7874
Epoch 61/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7771 - val_loss: 0.7849
Epoch 62/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7747 - val_loss: 0.7826
Epoch 63/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7723 - val_loss: 0.7800
Epoch 64/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7700 - val_loss: 0.7777
Epoch 65/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7676 - val_loss: 0.7752
Epoch 66/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7654 - val_loss: 0.7728
Epoch 67/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7632 - val_loss: 0.7703
Epoch 68/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7610 - val_loss: 0.7680
Epoch 69/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7589 - val_loss: 0.7657
Epoch 70/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7567 - val_loss: 0.7632
Epoch 71/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7546 - val_loss: 0.7610
Epoch 72/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7524 - val_loss: 0.7587
Epoch 73/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7504 - val_loss: 0.7563
Epoch 74/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7482 - val_loss: 0.7542
Epoch 75/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7461 - val_loss: 0.7520
Epoch 76/300
4/4 [==============================] - 0s 15ms/step - loss: 0.7441 - val_loss: 0.7496
Epoch 77/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7419 - val_loss: 0.7473
Epoch 78/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7399 - val_loss: 0.7451
Epoch 79/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7379 - val_loss: 0.7428
Epoch 80/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7358 - val_loss: 0.7407
Epoch 81/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7339 - val_loss: 0.7385
Epoch 82/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7318 - val_loss: 0.7365
Epoch 83/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7298 - val_loss: 0.7345
Epoch 84/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7279 - val_loss: 0.7324
Epoch 85/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7260 - val_loss: 0.7306
Epoch 86/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7240 - val_loss: 0.7286
Epoch 87/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7222 - val_loss: 0.7266
Epoch 88/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7203 - val_loss: 0.7247
Epoch 89/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7183 - val_loss: 0.7226
Epoch 90/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7164 - val_loss: 0.7207
Epoch 91/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7146 - val_loss: 0.7189
Epoch 92/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7127 - val_loss: 0.7169
Epoch 93/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7108 - val_loss: 0.7151
Epoch 94/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7089 - val_loss: 0.7132
Epoch 95/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7071 - val_loss: 0.7111
Epoch 96/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7053 - val_loss: 0.7092
Epoch 97/300
4/4 [==============================] - 0s 9ms/step - loss: 0.7036 - val_loss: 0.7074
Epoch 98/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7018 - val_loss: 0.7056
Epoch 99/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7001 - val_loss: 0.7038
Epoch 100/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6984 - val_loss: 0.7020
Epoch 101/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6968 - val_loss: 0.7002
Epoch 102/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6951 - val_loss: 0.6984
Epoch 103/300
4/4 [==============================] - 0s 17ms/step - loss: 0.6935 - val_loss: 0.6966
Epoch 104/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6919 - val_loss: 0.6949
Epoch 105/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6903 - val_loss: 0.6931
Epoch 106/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6887 - val_loss: 0.6912
Epoch 107/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6871 - val_loss: 0.6897
Epoch 108/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6856 - val_loss: 0.6877
Epoch 109/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6841 - val_loss: 0.6860
Epoch 110/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6825 - val_loss: 0.6847
Epoch 111/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6811 - val_loss: 0.6831
Epoch 112/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6796 - val_loss: 0.6812
Epoch 113/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6782 - val_loss: 0.6798
Epoch 114/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6768 - val_loss: 0.6784
Epoch 115/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6754 - val_loss: 0.6768
Epoch 116/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6740 - val_loss: 0.6753
Epoch 117/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6728 - val_loss: 0.6737
Epoch 118/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6713 - val_loss: 0.6722
Epoch 119/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6700 - val_loss: 0.6708
Epoch 120/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6688 - val_loss: 0.6695
Epoch 121/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6675 - val_loss: 0.6684
Epoch 122/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6666 - val_loss: 0.6664
Epoch 123/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6652 - val_loss: 0.6650
Epoch 124/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6641 - val_loss: 0.6640
Epoch 125/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6631 - val_loss: 0.6623
Epoch 126/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6619 - val_loss: 0.6606
Epoch 127/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6609 - val_loss: 0.6599
Epoch 128/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6598 - val_loss: 0.6582
Epoch 129/300
4/4 [==============================] - 0s 18ms/step - loss: 0.6589 - val_loss: 0.6571
Epoch 130/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6577 - val_loss: 0.6556
Epoch 131/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6566 - val_loss: 0.6545
Epoch 132/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6559 - val_loss: 0.6532
Epoch 133/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6547 - val_loss: 0.6523
Epoch 134/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6541 - val_loss: 0.6516
Epoch 135/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6535 - val_loss: 0.6503
Epoch 136/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6525 - val_loss: 0.6493
Epoch 137/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6518 - val_loss: 0.6481
Epoch 138/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6508 - val_loss: 0.6470
Epoch 139/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6501 - val_loss: 0.6470
Epoch 140/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6497 - val_loss: 0.6457
Epoch 141/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6488 - val_loss: 0.6447
Epoch 142/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6479 - val_loss: 0.6437
Epoch 143/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6473 - val_loss: 0.6429
Epoch 144/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6466 - val_loss: 0.6418
Epoch 145/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6459 - val_loss: 0.6410
Epoch 146/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6453 - val_loss: 0.6401
Epoch 147/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6446 - val_loss: 0.6391
Epoch 148/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6439 - val_loss: 0.6385
Epoch 149/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6435 - val_loss: 0.6378
Epoch 150/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6428 - val_loss: 0.6368
Epoch 151/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6420 - val_loss: 0.6365
Epoch 152/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6416 - val_loss: 0.6355
Epoch 153/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6411 - val_loss: 0.6348
Epoch 154/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6404 - val_loss: 0.6342
Epoch 155/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6399 - val_loss: 0.6335
Epoch 156/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6394 - val_loss: 0.6329
Epoch 157/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6388 - val_loss: 0.6323
Epoch 158/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6383 - val_loss: 0.6315
Epoch 159/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6377 - val_loss: 0.6308
Epoch 160/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6370 - val_loss: 0.6306
Epoch 161/300
4/4 [==============================] - 0s 17ms/step - loss: 0.6366 - val_loss: 0.6296
Epoch 162/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6359 - val_loss: 0.6291
Epoch 163/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6354 - val_loss: 0.6281
Epoch 164/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6347 - val_loss: 0.6275
Epoch 165/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6343 - val_loss: 0.6276
Epoch 166/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6338 - val_loss: 0.6268
Epoch 167/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6331 - val_loss: 0.6259
Epoch 168/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6325 - val_loss: 0.6253
Epoch 169/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6319 - val_loss: 0.6246
Epoch 170/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6315 - val_loss: 0.6239
Epoch 171/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6308 - val_loss: 0.6236
Epoch 172/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6303 - val_loss: 0.6227
Epoch 173/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6297 - val_loss: 0.6221
Epoch 174/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6291 - val_loss: 0.6217
Epoch 175/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6286 - val_loss: 0.6209
Epoch 176/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6281 - val_loss: 0.6198
Epoch 177/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6275 - val_loss: 0.6194
Epoch 178/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6269 - val_loss: 0.6187
Epoch 179/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6263 - val_loss: 0.6181
Epoch 180/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6258 - val_loss: 0.6178
Epoch 181/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6254 - val_loss: 0.6169
Epoch 182/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6248 - val_loss: 0.6166
Epoch 183/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6241 - val_loss: 0.6160
Epoch 184/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6237 - val_loss: 0.6153
Epoch 185/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6230 - val_loss: 0.6152
Epoch 186/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6227 - val_loss: 0.6143
Epoch 187/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6221 - val_loss: 0.6137
Epoch 188/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6215 - val_loss: 0.6130
Epoch 189/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6210 - val_loss: 0.6125
Epoch 190/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6204 - val_loss: 0.6122
Epoch 191/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6199 - val_loss: 0.6117
Epoch 192/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6194 - val_loss: 0.6113
Epoch 193/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6189 - val_loss: 0.6105
Epoch 194/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6185 - val_loss: 0.6102
Epoch 195/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6178 - val_loss: 0.6096
Epoch 196/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6173 - val_loss: 0.6089
Epoch 197/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6168 - val_loss: 0.6088
Epoch 198/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6163 - val_loss: 0.6079
Epoch 199/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6161 - val_loss: 0.6073
Epoch 200/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6155 - val_loss: 0.6068
Epoch 201/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6151 - val_loss: 0.6067
Epoch 202/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6148 - val_loss: 0.6063
Epoch 203/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6145 - val_loss: 0.6060
Epoch 204/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6141 - val_loss: 0.6057
Epoch 205/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6139 - val_loss: 0.6052
Epoch 206/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6135 - val_loss: 0.6049
Epoch 207/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6135 - val_loss: 0.6046
Epoch 208/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6131 - val_loss: 0.6040
Epoch 209/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6128 - val_loss: 0.6037
Epoch 210/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6126 - val_loss: 0.6036
Epoch 211/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6125 - val_loss: 0.6032
Epoch 212/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6124 - val_loss: 0.6032
Epoch 213/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6120 - val_loss: 0.6030
Epoch 214/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6122 - val_loss: 0.6027
Epoch 215/300
4/4 [==============================] - 0s 9ms/step - loss: 0.6119 - val_loss: 0.6026
Epoch 216/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6115 - val_loss: 0.6023
Epoch 217/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6117 - val_loss: 0.6026
Epoch 218/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6116 - val_loss: 0.6020
Epoch 219/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6112 - val_loss: 0.6024
Epoch 220/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6113 - val_loss: 0.6019
Epoch 221/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6111 - val_loss: 0.6022
Epoch 222/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6110 - val_loss: 0.6019
Epoch 223/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6110 - val_loss: 0.6016
Epoch 224/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6106 - val_loss: 0.6019
Epoch 225/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6109 - val_loss: 0.6011
Epoch 226/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6107 - val_loss: 0.6013
Epoch 227/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6107 - val_loss: 0.6008
Epoch 228/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6105 - val_loss: 0.6010
Epoch 229/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6106 - val_loss: 0.6008
Epoch 230/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6105 - val_loss: 0.6009
Epoch 231/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6103 - val_loss: 0.6005
Epoch 232/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6102 - val_loss: 0.6005
Epoch 233/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6102 - val_loss: 0.6006
Epoch 234/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6102 - val_loss: 0.6007
Epoch 235/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6102 - val_loss: 0.6003
Epoch 236/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6100 - val_loss: 0.6002
Epoch 237/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6100 - val_loss: 0.6000
Epoch 238/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6100 - val_loss: 0.6000
Epoch 239/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6099 - val_loss: 0.5999
Epoch 240/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6099 - val_loss: 0.5999
Epoch 241/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6098 - val_loss: 0.5999
Epoch 242/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6098 - val_loss: 0.5996
Epoch 243/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6098 - val_loss: 0.5998
Epoch 244/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6097 - val_loss: 0.5994
Epoch 245/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6096 - val_loss: 0.5997
Epoch 246/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6096 - val_loss: 0.5995
Epoch 247/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6095 - val_loss: 0.5992
Epoch 248/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6095 - val_loss: 0.5995
Epoch 249/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6094 - val_loss: 0.5997
Epoch 250/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6093 - val_loss: 0.5992
Epoch 251/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6093 - val_loss: 0.5994
Epoch 252/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6094 - val_loss: 0.5988
Epoch 253/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6092 - val_loss: 0.5988
Epoch 254/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6092 - val_loss: 0.5989
Epoch 255/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6092 - val_loss: 0.5989
Epoch 256/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6092 - val_loss: 0.5987
Epoch 257/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6091 - val_loss: 0.5987
Epoch 258/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6090 - val_loss: 0.5989
Epoch 259/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6091 - val_loss: 0.5985
Epoch 260/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6090 - val_loss: 0.5981
Epoch 261/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6089 - val_loss: 0.5983
Epoch 262/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6090 - val_loss: 0.5983
Epoch 263/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6089 - val_loss: 0.5981
Epoch 264/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6089 - val_loss: 0.5980
Epoch 265/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6088 - val_loss: 0.5979
Epoch 266/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6088 - val_loss: 0.5979
Epoch 267/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6089 - val_loss: 0.5980
Epoch 268/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6087 - val_loss: 0.5982
Epoch 269/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6087 - val_loss: 0.5981
Epoch 270/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6088 - val_loss: 0.5981
Epoch 271/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6086 - val_loss: 0.5983
Epoch 272/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6087 - val_loss: 0.5979
Epoch 273/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6086 - val_loss: 0.5984
Epoch 274/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6086 - val_loss: 0.5982
Epoch 275/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6086 - val_loss: 0.5979
Epoch 276/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6085 - val_loss: 0.5979
Epoch 277/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6085 - val_loss: 0.5974
Epoch 278/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6085 - val_loss: 0.5981
Epoch 279/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6085 - val_loss: 0.5978
Epoch 280/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6085 - val_loss: 0.5976
Epoch 281/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6084 - val_loss: 0.5979
Epoch 282/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6085 - val_loss: 0.5973
Epoch 283/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6083 - val_loss: 0.5975
Epoch 284/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6083 - val_loss: 0.5978
Epoch 285/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6083 - val_loss: 0.5975
Epoch 286/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6083 - val_loss: 0.5976
Epoch 287/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6083 - val_loss: 0.5976
Epoch 288/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6083 - val_loss: 0.5974
Epoch 289/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6082 - val_loss: 0.5978
Epoch 290/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6083 - val_loss: 0.5972
Epoch 291/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6082 - val_loss: 0.5972
Epoch 292/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6082 - val_loss: 0.5970
Epoch 293/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6082 - val_loss: 0.5966
Epoch 294/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6080 - val_loss: 0.5970
Epoch 295/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6082 - val_loss: 0.5964
Epoch 296/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6081 - val_loss: 0.5970
Epoch 297/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6081 - val_loss: 0.5970
Epoch 298/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6080 - val_loss: 0.5968
Epoch 299/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6080 - val_loss: 0.5972
Epoch 300/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6081 - val_loss: 0.5968
Epoch 1/300
4/4 [==============================] - 0s 21ms/step - loss: 1.0347 - val_loss: 1.0561
Epoch 2/300
4/4 [==============================] - 0s 11ms/step - loss: 1.0303 - val_loss: 1.0524
Epoch 3/300
4/4 [==============================] - 0s 11ms/step - loss: 1.0269 - val_loss: 1.0499
Epoch 4/300
4/4 [==============================] - 0s 11ms/step - loss: 1.0243 - val_loss: 1.0475
Epoch 5/300
4/4 [==============================] - 0s 12ms/step - loss: 1.0217 - val_loss: 1.0455
Epoch 6/300
4/4 [==============================] - 0s 11ms/step - loss: 1.0193 - val_loss: 1.0435
Epoch 7/300
4/4 [==============================] - 0s 12ms/step - loss: 1.0171 - val_loss: 1.0415
Epoch 8/300
4/4 [==============================] - 0s 13ms/step - loss: 1.0149 - val_loss: 1.0395
Epoch 9/300
4/4 [==============================] - 0s 18ms/step - loss: 1.0127 - val_loss: 1.0375
Epoch 10/300
4/4 [==============================] - 0s 13ms/step - loss: 1.0106 - val_loss: 1.0358
Epoch 11/300
4/4 [==============================] - 0s 12ms/step - loss: 1.0085 - val_loss: 1.0341
Epoch 12/300
4/4 [==============================] - 0s 10ms/step - loss: 1.0067 - val_loss: 1.0322
Epoch 13/300
4/4 [==============================] - 0s 13ms/step - loss: 1.0047 - val_loss: 1.0308
Epoch 14/300
4/4 [==============================] - 0s 15ms/step - loss: 1.0027 - val_loss: 1.0288
Epoch 15/300
4/4 [==============================] - 0s 10ms/step - loss: 1.0008 - val_loss: 1.0271
Epoch 16/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9988 - val_loss: 1.0256
Epoch 17/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9970 - val_loss: 1.0238
Epoch 18/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9951 - val_loss: 1.0224
Epoch 19/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9933 - val_loss: 1.0205
Epoch 20/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9915 - val_loss: 1.0190
Epoch 21/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9896 - val_loss: 1.0174
Epoch 22/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9880 - val_loss: 1.0156
Epoch 23/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9860 - val_loss: 1.0141
Epoch 24/300
4/4 [==============================] - 0s 18ms/step - loss: 0.9843 - val_loss: 1.0124
Epoch 25/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9825 - val_loss: 1.0111
Epoch 26/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9808 - val_loss: 1.0093
Epoch 27/300
4/4 [==============================] - 0s 14ms/step - loss: 0.9791 - val_loss: 1.0081
Epoch 28/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9774 - val_loss: 1.0060
Epoch 29/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9755 - val_loss: 1.0045
Epoch 30/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9738 - val_loss: 1.0031
Epoch 31/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9723 - val_loss: 1.0016
Epoch 32/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9708 - val_loss: 0.9997
Epoch 33/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9688 - val_loss: 0.9980
Epoch 34/300
4/4 [==============================] - 0s 13ms/step - loss: 0.9671 - val_loss: 0.9963
Epoch 35/300
4/4 [==============================] - 0s 14ms/step - loss: 0.9653 - val_loss: 0.9945
Epoch 36/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9635 - val_loss: 0.9930
Epoch 37/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9619 - val_loss: 0.9916
Epoch 38/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9604 - val_loss: 0.9897
Epoch 39/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9583 - val_loss: 0.9881
Epoch 40/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9566 - val_loss: 0.9867
Epoch 41/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9551 - val_loss: 0.9846
Epoch 42/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9532 - val_loss: 0.9829
Epoch 43/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9514 - val_loss: 0.9811
Epoch 44/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9496 - val_loss: 0.9795
Epoch 45/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9479 - val_loss: 0.9777
Epoch 46/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9461 - val_loss: 0.9761
Epoch 47/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9445 - val_loss: 0.9743
Epoch 48/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9425 - val_loss: 0.9726
Epoch 49/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9407 - val_loss: 0.9712
Epoch 50/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9392 - val_loss: 0.9691
Epoch 51/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9371 - val_loss: 0.9673
Epoch 52/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9353 - val_loss: 0.9656
Epoch 53/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9336 - val_loss: 0.9638
Epoch 54/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9316 - val_loss: 0.9619
Epoch 55/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9299 - val_loss: 0.9606
Epoch 56/300
4/4 [==============================] - 0s 13ms/step - loss: 0.9282 - val_loss: 0.9585
Epoch 57/300
4/4 [==============================] - 0s 14ms/step - loss: 0.9262 - val_loss: 0.9567
Epoch 58/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9243 - val_loss: 0.9548
Epoch 59/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9225 - val_loss: 0.9530
Epoch 60/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9207 - val_loss: 0.9513
Epoch 61/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9187 - val_loss: 0.9493
Epoch 62/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9168 - val_loss: 0.9475
Epoch 63/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9149 - val_loss: 0.9459
Epoch 64/300
4/4 [==============================] - 0s 12ms/step - loss: 0.9131 - val_loss: 0.9441
Epoch 65/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9110 - val_loss: 0.9419
Epoch 66/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9091 - val_loss: 0.9401
Epoch 67/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9071 - val_loss: 0.9383
Epoch 68/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9052 - val_loss: 0.9365
Epoch 69/300
4/4 [==============================] - 0s 10ms/step - loss: 0.9033 - val_loss: 0.9343
Epoch 70/300
4/4 [==============================] - 0s 11ms/step - loss: 0.9013 - val_loss: 0.9325
Epoch 71/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8993 - val_loss: 0.9303
Epoch 72/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8972 - val_loss: 0.9284
Epoch 73/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8952 - val_loss: 0.9263
Epoch 74/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8931 - val_loss: 0.9243
Epoch 75/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8910 - val_loss: 0.9221
Epoch 76/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8890 - val_loss: 0.9203
Epoch 77/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8869 - val_loss: 0.9181
Epoch 78/300
4/4 [==============================] - 0s 13ms/step - loss: 0.8847 - val_loss: 0.9160
Epoch 79/300
4/4 [==============================] - 0s 19ms/step - loss: 0.8827 - val_loss: 0.9139
Epoch 80/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8804 - val_loss: 0.9116
Epoch 81/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8783 - val_loss: 0.9095
Epoch 82/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8761 - val_loss: 0.9072
Epoch 83/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8738 - val_loss: 0.9049
Epoch 84/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8716 - val_loss: 0.9030
Epoch 85/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8694 - val_loss: 0.9003
Epoch 86/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8670 - val_loss: 0.8982
Epoch 87/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8647 - val_loss: 0.8959
Epoch 88/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8623 - val_loss: 0.8934
Epoch 89/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8599 - val_loss: 0.8911
Epoch 90/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8577 - val_loss: 0.8886
Epoch 91/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8552 - val_loss: 0.8862
Epoch 92/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8527 - val_loss: 0.8837
Epoch 93/300
4/4 [==============================] - 0s 13ms/step - loss: 0.8502 - val_loss: 0.8812
Epoch 94/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8478 - val_loss: 0.8788
Epoch 95/300
4/4 [==============================] - 0s 10ms/step - loss: 0.8453 - val_loss: 0.8762
Epoch 96/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8428 - val_loss: 0.8736
Epoch 97/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8401 - val_loss: 0.8711
Epoch 98/300
4/4 [==============================] - 0s 17ms/step - loss: 0.8376 - val_loss: 0.8686
Epoch 99/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8349 - val_loss: 0.8660
Epoch 100/300
4/4 [==============================] - 0s 19ms/step - loss: 0.8322 - val_loss: 0.8634
Epoch 101/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8295 - val_loss: 0.8605
Epoch 102/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8269 - val_loss: 0.8580
Epoch 103/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8240 - val_loss: 0.8552
Epoch 104/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8213 - val_loss: 0.8524
Epoch 105/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8185 - val_loss: 0.8495
Epoch 106/300
4/4 [==============================] - 0s 13ms/step - loss: 0.8157 - val_loss: 0.8466
Epoch 107/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8128 - val_loss: 0.8437
Epoch 108/300
4/4 [==============================] - 0s 12ms/step - loss: 0.8099 - val_loss: 0.8407
Epoch 109/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8070 - val_loss: 0.8378
Epoch 110/300
4/4 [==============================] - 0s 13ms/step - loss: 0.8042 - val_loss: 0.8351
Epoch 111/300
4/4 [==============================] - 0s 11ms/step - loss: 0.8012 - val_loss: 0.8319
Epoch 112/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7983 - val_loss: 0.8290
Epoch 113/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7953 - val_loss: 0.8261
Epoch 114/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7923 - val_loss: 0.8230
Epoch 115/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7893 - val_loss: 0.8199
Epoch 116/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7862 - val_loss: 0.8168
Epoch 117/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7833 - val_loss: 0.8138
Epoch 118/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7803 - val_loss: 0.8106
Epoch 119/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7772 - val_loss: 0.8073
Epoch 120/300
4/4 [==============================] - 0s 15ms/step - loss: 0.7742 - val_loss: 0.8039
Epoch 121/300
4/4 [==============================] - 0s 15ms/step - loss: 0.7711 - val_loss: 0.8006
Epoch 122/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7680 - val_loss: 0.7974
Epoch 123/300
4/4 [==============================] - 0s 15ms/step - loss: 0.7650 - val_loss: 0.7938
Epoch 124/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7620 - val_loss: 0.7904
Epoch 125/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7588 - val_loss: 0.7869
Epoch 126/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7558 - val_loss: 0.7837
Epoch 127/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7527 - val_loss: 0.7801
Epoch 128/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7498 - val_loss: 0.7768
Epoch 129/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7468 - val_loss: 0.7738
Epoch 130/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7439 - val_loss: 0.7708
Epoch 131/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7412 - val_loss: 0.7678
Epoch 132/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7384 - val_loss: 0.7645
Epoch 133/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7357 - val_loss: 0.7614
Epoch 134/300
4/4 [==============================] - 0s 13ms/step - loss: 0.7330 - val_loss: 0.7585
Epoch 135/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7304 - val_loss: 0.7554
Epoch 136/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7277 - val_loss: 0.7526
Epoch 137/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7253 - val_loss: 0.7500
Epoch 138/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7229 - val_loss: 0.7470
Epoch 139/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7204 - val_loss: 0.7444
Epoch 140/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7181 - val_loss: 0.7418
Epoch 141/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7157 - val_loss: 0.7392
Epoch 142/300
4/4 [==============================] - 0s 14ms/step - loss: 0.7134 - val_loss: 0.7365
Epoch 143/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7112 - val_loss: 0.7342
Epoch 144/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7088 - val_loss: 0.7319
Epoch 145/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7067 - val_loss: 0.7297
Epoch 146/300
4/4 [==============================] - 0s 12ms/step - loss: 0.7046 - val_loss: 0.7275
Epoch 147/300
4/4 [==============================] - 0s 11ms/step - loss: 0.7026 - val_loss: 0.7258
Epoch 148/300
4/4 [==============================] - 0s 10ms/step - loss: 0.7007 - val_loss: 0.7239
Epoch 149/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6988 - val_loss: 0.7220
Epoch 150/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6970 - val_loss: 0.7204
Epoch 151/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6953 - val_loss: 0.7187
Epoch 152/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6936 - val_loss: 0.7170
Epoch 153/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6920 - val_loss: 0.7153
Epoch 154/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6905 - val_loss: 0.7137
Epoch 155/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6891 - val_loss: 0.7121
Epoch 156/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6876 - val_loss: 0.7109
Epoch 157/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6863 - val_loss: 0.7091
Epoch 158/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6852 - val_loss: 0.7073
Epoch 159/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6839 - val_loss: 0.7061
Epoch 160/300
4/4 [==============================] - 0s 17ms/step - loss: 0.6829 - val_loss: 0.7049
Epoch 161/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6819 - val_loss: 0.7036
Epoch 162/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6811 - val_loss: 0.7024
Epoch 163/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6803 - val_loss: 0.7014
Epoch 164/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6795 - val_loss: 0.7004
Epoch 165/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6790 - val_loss: 0.7000
Epoch 166/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6785 - val_loss: 0.6990
Epoch 167/300
4/4 [==============================] - 0s 17ms/step - loss: 0.6779 - val_loss: 0.6983
Epoch 168/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6774 - val_loss: 0.6969
Epoch 169/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6768 - val_loss: 0.6961
Epoch 170/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6763 - val_loss: 0.6952
Epoch 171/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6760 - val_loss: 0.6947
Epoch 172/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6754 - val_loss: 0.6937
Epoch 173/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6750 - val_loss: 0.6932
Epoch 174/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6745 - val_loss: 0.6928
Epoch 175/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6741 - val_loss: 0.6921
Epoch 176/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6736 - val_loss: 0.6915
Epoch 177/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6733 - val_loss: 0.6910
Epoch 178/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6727 - val_loss: 0.6908
Epoch 179/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6724 - val_loss: 0.6904
Epoch 180/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6720 - val_loss: 0.6900
Epoch 181/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6717 - val_loss: 0.6898
Epoch 182/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6714 - val_loss: 0.6896
Epoch 183/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6710 - val_loss: 0.6894
Epoch 184/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6707 - val_loss: 0.6890
Epoch 185/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6701 - val_loss: 0.6884
Epoch 186/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6699 - val_loss: 0.6882
Epoch 187/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6695 - val_loss: 0.6877
Epoch 188/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6692 - val_loss: 0.6873
Epoch 189/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6688 - val_loss: 0.6867
Epoch 190/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6684 - val_loss: 0.6864
Epoch 191/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6683 - val_loss: 0.6862
Epoch 192/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6679 - val_loss: 0.6857
Epoch 193/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6674 - val_loss: 0.6854
Epoch 194/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6670 - val_loss: 0.6852
Epoch 195/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6671 - val_loss: 0.6848
Epoch 196/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6664 - val_loss: 0.6845
Epoch 197/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6663 - val_loss: 0.6843
Epoch 198/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6660 - val_loss: 0.6838
Epoch 199/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6654 - val_loss: 0.6833
Epoch 200/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6652 - val_loss: 0.6835
Epoch 201/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6654 - val_loss: 0.6827
Epoch 202/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6649 - val_loss: 0.6829
Epoch 203/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6647 - val_loss: 0.6827
Epoch 204/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6643 - val_loss: 0.6823
Epoch 205/300
4/4 [==============================] - 0s 15ms/step - loss: 0.6640 - val_loss: 0.6819
Epoch 206/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6640 - val_loss: 0.6816
Epoch 207/300
4/4 [==============================] - 0s 15ms/step - loss: 0.6636 - val_loss: 0.6819
Epoch 208/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6636 - val_loss: 0.6816
Epoch 209/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6634 - val_loss: 0.6809
Epoch 210/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6630 - val_loss: 0.6806
Epoch 211/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6628 - val_loss: 0.6810
Epoch 212/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6628 - val_loss: 0.6806
Epoch 213/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6624 - val_loss: 0.6807
Epoch 214/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6621 - val_loss: 0.6804
Epoch 215/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6621 - val_loss: 0.6800
Epoch 216/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6618 - val_loss: 0.6800
Epoch 217/300
4/4 [==============================] - 0s 15ms/step - loss: 0.6616 - val_loss: 0.6799
Epoch 218/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6615 - val_loss: 0.6792
Epoch 219/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6613 - val_loss: 0.6788
Epoch 220/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6612 - val_loss: 0.6785
Epoch 221/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6609 - val_loss: 0.6786
Epoch 222/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6609 - val_loss: 0.6786
Epoch 223/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6606 - val_loss: 0.6784
Epoch 224/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6605 - val_loss: 0.6782
Epoch 225/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6604 - val_loss: 0.6782
Epoch 226/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6602 - val_loss: 0.6781
Epoch 227/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6599 - val_loss: 0.6780
Epoch 228/300
4/4 [==============================] - 0s 16ms/step - loss: 0.6599 - val_loss: 0.6782
Epoch 229/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6596 - val_loss: 0.6779
Epoch 230/300
4/4 [==============================] - 0s 17ms/step - loss: 0.6596 - val_loss: 0.6777
Epoch 231/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6593 - val_loss: 0.6773
Epoch 232/300
4/4 [==============================] - 0s 18ms/step - loss: 0.6593 - val_loss: 0.6774
Epoch 233/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6590 - val_loss: 0.6772
Epoch 234/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6588 - val_loss: 0.6772
Epoch 235/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6587 - val_loss: 0.6769
Epoch 236/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6585 - val_loss: 0.6768
Epoch 237/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6584 - val_loss: 0.6762
Epoch 238/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6582 - val_loss: 0.6763
Epoch 239/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6581 - val_loss: 0.6762
Epoch 240/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6580 - val_loss: 0.6766
Epoch 241/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6580 - val_loss: 0.6761
Epoch 242/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6577 - val_loss: 0.6762
Epoch 243/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6577 - val_loss: 0.6758
Epoch 244/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6575 - val_loss: 0.6759
Epoch 245/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6574 - val_loss: 0.6758
Epoch 246/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6573 - val_loss: 0.6757
Epoch 247/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6571 - val_loss: 0.6760
Epoch 248/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6571 - val_loss: 0.6755
Epoch 249/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6569 - val_loss: 0.6754
Epoch 250/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6567 - val_loss: 0.6755
Epoch 251/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6566 - val_loss: 0.6757
Epoch 252/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6565 - val_loss: 0.6760
Epoch 253/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6564 - val_loss: 0.6755
Epoch 254/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6563 - val_loss: 0.6756
Epoch 255/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6562 - val_loss: 0.6754
Epoch 256/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6561 - val_loss: 0.6754
Epoch 257/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6560 - val_loss: 0.6753
Epoch 258/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6559 - val_loss: 0.6756
Epoch 259/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6557 - val_loss: 0.6751
Epoch 260/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6558 - val_loss: 0.6748
Epoch 261/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6555 - val_loss: 0.6747
Epoch 262/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6555 - val_loss: 0.6746
Epoch 263/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6554 - val_loss: 0.6750
Epoch 264/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6554 - val_loss: 0.6753
Epoch 265/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6553 - val_loss: 0.6750
Epoch 266/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6551 - val_loss: 0.6754
Epoch 267/300
4/4 [==============================] - 0s 14ms/step - loss: 0.6550 - val_loss: 0.6746
Epoch 268/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6548 - val_loss: 0.6747
Epoch 269/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6549 - val_loss: 0.6749
Epoch 270/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6548 - val_loss: 0.6748
Epoch 271/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6547 - val_loss: 0.6747
Epoch 272/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6546 - val_loss: 0.6745
Epoch 273/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6545 - val_loss: 0.6743
Epoch 274/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6545 - val_loss: 0.6746
Epoch 275/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6543 - val_loss: 0.6750
Epoch 276/300
4/4 [==============================] - 0s 18ms/step - loss: 0.6543 - val_loss: 0.6746
Epoch 277/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6542 - val_loss: 0.6742
Epoch 278/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6540 - val_loss: 0.6742
Epoch 279/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6540 - val_loss: 0.6742
Epoch 280/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6538 - val_loss: 0.6739
Epoch 281/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6538 - val_loss: 0.6735
Epoch 282/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6536 - val_loss: 0.6736
Epoch 283/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6535 - val_loss: 0.6735
Epoch 284/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6534 - val_loss: 0.6739
Epoch 285/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6534 - val_loss: 0.6737
Epoch 286/300
4/4 [==============================] - 0s 10ms/step - loss: 0.6531 - val_loss: 0.6737
Epoch 287/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6532 - val_loss: 0.6738
Epoch 288/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6531 - val_loss: 0.6733
Epoch 289/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6528 - val_loss: 0.6736
Epoch 290/300
4/4 [==============================] - 0s 19ms/step - loss: 0.6529 - val_loss: 0.6738
Epoch 291/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6528 - val_loss: 0.6732
Epoch 292/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6527 - val_loss: 0.6730
Epoch 293/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6525 - val_loss: 0.6733
Epoch 294/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6525 - val_loss: 0.6732
Epoch 295/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6524 - val_loss: 0.6733
Epoch 296/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6523 - val_loss: 0.6732
Epoch 297/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6523 - val_loss: 0.6729
Epoch 298/300
4/4 [==============================] - 0s 12ms/step - loss: 0.6521 - val_loss: 0.6731
Epoch 299/300
4/4 [==============================] - 0s 11ms/step - loss: 0.6521 - val_loss: 0.6729
Epoch 300/300
4/4 [==============================] - 0s 13ms/step - loss: 0.6520 - val_loss: 0.6728
32/32 [==============================] - 0s 2ms/step
32/32 [==============================] - 0s 1ms/step
32/32 [==============================] - 0s 1ms/step
32/32 [==============================] - 0s 2ms/step
In [ ]:
# plot history
fig, axes = plt.subplots(ncols=4, figsize=(25, 6))
axes[0].plot(history1.history['loss'], label='Training loss')
axes[0].plot(history1.history['val_loss'], label='Validation loss')
axes[1].plot(history2.history['loss'], label='Training loss')
axes[1].plot(history2.history['val_loss'], label='Validation loss')
axes[2].plot(history3.history['loss'], label='Training loss')
axes[2].plot(history3.history['val_loss'], label='Validation loss')
axes[3].plot(history4.history['loss'], label='Training loss')
axes[3].plot(history4.history['val_loss'], label='Validation loss')
axes[0].legend()
axes[0].set_title('Class 1')
axes[0].set(
    ylabel='Loss',
    xlabel='Epoch'
)
axes[1].set_title('Class 2')
axes[1].legend()
axes[1].set(
    ylabel='Loss',
    xlabel='Epoch'
)
axes[2].set_title('Class 3')
axes[2].legend()
axes[2].set(
    ylabel='Loss',
    xlabel='Epoch'
)
axes[3].set_title('Class 4')
axes[3].legend()
axes[3].set(
    ylabel='Loss',
    xlabel='Epoch'
)
plt.show()
In [ ]:
# plot the data
plt.scatter(encoded_x1[:,0], encoded_x1[:,1], label='Class 1')
plt.scatter(encoded_x2[:,0], encoded_x2[:,1], label='Class 2')
plt.scatter(encoded_x3[:,0], encoded_x3[:,1], label='Class 3')
plt.scatter(encoded_x4[:,0], encoded_x4[:,1], label='Class 4')
plt.legend()
plt.show()

Questão 6¶

Pesquise sobre redes neurais recorrentes LSTM. Apresente neste estudo aplicações das LSTM deep learning. Seguem abaixo sugestões de aplicações.

  1. Predição de series temporais (exemplo: predição de palavras no texto, ou predição de ações na bolsa de valores, etc.)
  2. Reconhecimento de voz
  3. Processamento de Linguagem Natural
  4. Outra aplicações de livre escolha

As redes neurais LSTM (Long Short-Term Memory) são um tipo específico de redes neurais recorrentes que receberam muita atenção recentemente na comunidade de aprendizado de máquina. De forma geral, as redes LSTM possuem a característica de realimentação, o que permite um efeito de memória de curto e longo prazo. Com isso, a saída é modulada pelo estado dessas células, tornando-se uma propriedade muito importante quando precisamos que a previsão da rede neural dependa do contexto histórico das entradas, e não apenas da última entrada.

Source: [Didática Tech](https://didatica.tech/lstm-long-short-term-memory/)

Exemplo de uma aplicação

Uma das aplicações das redes neurais LSTM é a predição de séries temporais, isto é, tentar prever o próximo valor a partir de valores anteriores. Assim, escolheu-se um problema onde, dado um ano e um mês, a tarefa é prever o número de passageiros de companhias aéreas internacionais em unidades de 1.000. Os dados variam de janeiro de 1949 a dezembro de 1960, ou 12 anos, com 144 observações.

Ou seja, dado o número de passageiros (em unidades de milhares) neste mês, qual é o número de passageiros no próximo mês?

Você pode escrever uma função simples para converter a única coluna de dados em um conjunto de dados de duas colunas: a primeira coluna contendo a contagem de passageiros deste mês (t) e a segunda coluna contendo a contagem de passageiros do próximo mês (t+1) a ser prevista.

In [51]:
# LSTM for international airline passengers problem with regression framing
import numpy as np
import matplotlib.pyplot as plt
from pandas import read_csv
import math
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import LSTM
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_squared_error
In [52]:
# convert an array of values into a dataset matrix
def create_dataset(dataset, look_back=1):
	dataX, dataY = [], []
	for i in range(len(dataset)-look_back-1):
		a = dataset[i:(i+look_back), 0]
		dataX.append(a)
		dataY.append(dataset[i + look_back, 0])
	return np.array(dataX), np.array(dataY)
In [53]:
# fix random seed for reproducibility
tf.random.set_seed(7)
In [54]:
# load the dataset
dataframe = read_csv('airline-passengers.csv', usecols=[1], engine='python')
dataset = dataframe.values
dataset = dataset.astype('float32')
In [55]:
# normalize the dataset
scaler = MinMaxScaler(feature_range=(0, 1))
dataset = scaler.fit_transform(dataset)
In [57]:
# split into train and test sets
train_size = int(len(dataset) * 0.67)
test_size = len(dataset) - train_size
train, test = dataset[0:train_size,:], dataset[train_size:len(dataset),:]
In [58]:
print("Train shape: ", train.shape)
print("Test shape: ", test.shape)
Train shape:  (96, 1)
Test shape:  (48, 1)
In [59]:
# reshape into X=t and Y=t+1
look_back = 1
trainX, trainY = create_dataset(train, look_back)
testX, testY = create_dataset(test, look_back)
In [60]:
# reshape input to be [samples, time steps, features]
trainX = np.reshape(trainX, (trainX.shape[0], 1, trainX.shape[1]))
testX = np.reshape(testX, (testX.shape[0], 1, testX.shape[1]))
In [61]:
# create and fit the LSTM network
model = Sequential()
model.add(LSTM(4, input_shape=(1, look_back)))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(trainX, trainY, epochs=100, batch_size=1, verbose=2)
Epoch 1/100
94/94 - 2s - loss: 0.0431 - 2s/epoch - 24ms/step
Epoch 2/100
94/94 - 0s - loss: 0.0229 - 224ms/epoch - 2ms/step
Epoch 3/100
94/94 - 0s - loss: 0.0163 - 212ms/epoch - 2ms/step
Epoch 4/100
94/94 - 0s - loss: 0.0148 - 232ms/epoch - 2ms/step
Epoch 5/100
94/94 - 0s - loss: 0.0140 - 217ms/epoch - 2ms/step
Epoch 6/100
94/94 - 0s - loss: 0.0130 - 217ms/epoch - 2ms/step
Epoch 7/100
94/94 - 0s - loss: 0.0121 - 217ms/epoch - 2ms/step
Epoch 8/100
94/94 - 0s - loss: 0.0112 - 226ms/epoch - 2ms/step
Epoch 9/100
94/94 - 0s - loss: 0.0104 - 211ms/epoch - 2ms/step
Epoch 10/100
94/94 - 0s - loss: 0.0094 - 220ms/epoch - 2ms/step
Epoch 11/100
94/94 - 0s - loss: 0.0084 - 211ms/epoch - 2ms/step
Epoch 12/100
94/94 - 0s - loss: 0.0074 - 219ms/epoch - 2ms/step
Epoch 13/100
94/94 - 0s - loss: 0.0066 - 221ms/epoch - 2ms/step
Epoch 14/100
94/94 - 0s - loss: 0.0057 - 217ms/epoch - 2ms/step
Epoch 15/100
94/94 - 0s - loss: 0.0050 - 214ms/epoch - 2ms/step
Epoch 16/100
94/94 - 0s - loss: 0.0043 - 214ms/epoch - 2ms/step
Epoch 17/100
94/94 - 0s - loss: 0.0037 - 220ms/epoch - 2ms/step
Epoch 18/100
94/94 - 0s - loss: 0.0032 - 216ms/epoch - 2ms/step
Epoch 19/100
94/94 - 0s - loss: 0.0029 - 210ms/epoch - 2ms/step
Epoch 20/100
94/94 - 0s - loss: 0.0026 - 220ms/epoch - 2ms/step
Epoch 21/100
94/94 - 0s - loss: 0.0024 - 212ms/epoch - 2ms/step
Epoch 22/100
94/94 - 0s - loss: 0.0022 - 226ms/epoch - 2ms/step
Epoch 23/100
94/94 - 0s - loss: 0.0021 - 208ms/epoch - 2ms/step
Epoch 24/100
94/94 - 0s - loss: 0.0020 - 218ms/epoch - 2ms/step
Epoch 25/100
94/94 - 0s - loss: 0.0021 - 217ms/epoch - 2ms/step
Epoch 26/100
94/94 - 0s - loss: 0.0020 - 208ms/epoch - 2ms/step
Epoch 27/100
94/94 - 0s - loss: 0.0020 - 220ms/epoch - 2ms/step
Epoch 28/100
94/94 - 0s - loss: 0.0020 - 215ms/epoch - 2ms/step
Epoch 29/100
94/94 - 0s - loss: 0.0021 - 219ms/epoch - 2ms/step
Epoch 30/100
94/94 - 0s - loss: 0.0020 - 210ms/epoch - 2ms/step
Epoch 31/100
94/94 - 0s - loss: 0.0020 - 222ms/epoch - 2ms/step
Epoch 32/100
94/94 - 0s - loss: 0.0021 - 221ms/epoch - 2ms/step
Epoch 33/100
94/94 - 0s - loss: 0.0021 - 213ms/epoch - 2ms/step
Epoch 34/100
94/94 - 0s - loss: 0.0020 - 214ms/epoch - 2ms/step
Epoch 35/100
94/94 - 0s - loss: 0.0021 - 213ms/epoch - 2ms/step
Epoch 36/100
94/94 - 0s - loss: 0.0020 - 224ms/epoch - 2ms/step
Epoch 37/100
94/94 - 0s - loss: 0.0020 - 207ms/epoch - 2ms/step
Epoch 38/100
94/94 - 0s - loss: 0.0020 - 213ms/epoch - 2ms/step
Epoch 39/100
94/94 - 0s - loss: 0.0020 - 219ms/epoch - 2ms/step
Epoch 40/100
94/94 - 0s - loss: 0.0020 - 223ms/epoch - 2ms/step
Epoch 41/100
94/94 - 0s - loss: 0.0020 - 216ms/epoch - 2ms/step
Epoch 42/100
94/94 - 0s - loss: 0.0020 - 216ms/epoch - 2ms/step
Epoch 43/100
94/94 - 0s - loss: 0.0020 - 212ms/epoch - 2ms/step
Epoch 44/100
94/94 - 0s - loss: 0.0020 - 210ms/epoch - 2ms/step
Epoch 45/100
94/94 - 0s - loss: 0.0021 - 216ms/epoch - 2ms/step
Epoch 46/100
94/94 - 0s - loss: 0.0019 - 229ms/epoch - 2ms/step
Epoch 47/100
94/94 - 0s - loss: 0.0020 - 216ms/epoch - 2ms/step
Epoch 48/100
94/94 - 0s - loss: 0.0021 - 216ms/epoch - 2ms/step
Epoch 49/100
94/94 - 0s - loss: 0.0020 - 211ms/epoch - 2ms/step
Epoch 50/100
94/94 - 0s - loss: 0.0020 - 219ms/epoch - 2ms/step
Epoch 51/100
94/94 - 0s - loss: 0.0020 - 213ms/epoch - 2ms/step
Epoch 52/100
94/94 - 0s - loss: 0.0020 - 209ms/epoch - 2ms/step
Epoch 53/100
94/94 - 0s - loss: 0.0020 - 211ms/epoch - 2ms/step
Epoch 54/100
94/94 - 0s - loss: 0.0020 - 219ms/epoch - 2ms/step
Epoch 55/100
94/94 - 0s - loss: 0.0020 - 210ms/epoch - 2ms/step
Epoch 56/100
94/94 - 0s - loss: 0.0020 - 215ms/epoch - 2ms/step
Epoch 57/100
94/94 - 0s - loss: 0.0020 - 213ms/epoch - 2ms/step
Epoch 58/100
94/94 - 0s - loss: 0.0020 - 214ms/epoch - 2ms/step
Epoch 59/100
94/94 - 0s - loss: 0.0020 - 219ms/epoch - 2ms/step
Epoch 60/100
94/94 - 0s - loss: 0.0020 - 212ms/epoch - 2ms/step
Epoch 61/100
94/94 - 0s - loss: 0.0020 - 217ms/epoch - 2ms/step
Epoch 62/100
94/94 - 0s - loss: 0.0021 - 213ms/epoch - 2ms/step
Epoch 63/100
94/94 - 0s - loss: 0.0020 - 211ms/epoch - 2ms/step
Epoch 64/100
94/94 - 0s - loss: 0.0020 - 221ms/epoch - 2ms/step
Epoch 65/100
94/94 - 0s - loss: 0.0020 - 224ms/epoch - 2ms/step
Epoch 66/100
94/94 - 0s - loss: 0.0020 - 217ms/epoch - 2ms/step
Epoch 67/100
94/94 - 0s - loss: 0.0021 - 209ms/epoch - 2ms/step
Epoch 68/100
94/94 - 0s - loss: 0.0020 - 219ms/epoch - 2ms/step
Epoch 69/100
94/94 - 0s - loss: 0.0020 - 209ms/epoch - 2ms/step
Epoch 70/100
94/94 - 0s - loss: 0.0020 - 214ms/epoch - 2ms/step
Epoch 71/100
94/94 - 0s - loss: 0.0020 - 214ms/epoch - 2ms/step
Epoch 72/100
94/94 - 0s - loss: 0.0020 - 216ms/epoch - 2ms/step
Epoch 73/100
94/94 - 0s - loss: 0.0020 - 222ms/epoch - 2ms/step
Epoch 74/100
94/94 - 0s - loss: 0.0021 - 229ms/epoch - 2ms/step
Epoch 75/100
94/94 - 0s - loss: 0.0020 - 227ms/epoch - 2ms/step
Epoch 76/100
94/94 - 0s - loss: 0.0020 - 211ms/epoch - 2ms/step
Epoch 77/100
94/94 - 0s - loss: 0.0019 - 218ms/epoch - 2ms/step
Epoch 78/100
94/94 - 0s - loss: 0.0020 - 216ms/epoch - 2ms/step
Epoch 79/100
94/94 - 0s - loss: 0.0020 - 209ms/epoch - 2ms/step
Epoch 80/100
94/94 - 0s - loss: 0.0020 - 211ms/epoch - 2ms/step
Epoch 81/100
94/94 - 0s - loss: 0.0019 - 216ms/epoch - 2ms/step
Epoch 82/100
94/94 - 0s - loss: 0.0021 - 220ms/epoch - 2ms/step
Epoch 83/100
94/94 - 0s - loss: 0.0020 - 211ms/epoch - 2ms/step
Epoch 84/100
94/94 - 0s - loss: 0.0019 - 217ms/epoch - 2ms/step
Epoch 85/100
94/94 - 0s - loss: 0.0020 - 214ms/epoch - 2ms/step
Epoch 86/100
94/94 - 0s - loss: 0.0020 - 223ms/epoch - 2ms/step
Epoch 87/100
94/94 - 0s - loss: 0.0021 - 235ms/epoch - 2ms/step
Epoch 88/100
94/94 - 0s - loss: 0.0020 - 231ms/epoch - 2ms/step
Epoch 89/100
94/94 - 0s - loss: 0.0020 - 213ms/epoch - 2ms/step
Epoch 90/100
94/94 - 0s - loss: 0.0020 - 217ms/epoch - 2ms/step
Epoch 91/100
94/94 - 0s - loss: 0.0020 - 228ms/epoch - 2ms/step
Epoch 92/100
94/94 - 0s - loss: 0.0020 - 210ms/epoch - 2ms/step
Epoch 93/100
94/94 - 0s - loss: 0.0020 - 226ms/epoch - 2ms/step
Epoch 94/100
94/94 - 0s - loss: 0.0020 - 209ms/epoch - 2ms/step
Epoch 95/100
94/94 - 0s - loss: 0.0021 - 216ms/epoch - 2ms/step
Epoch 96/100
94/94 - 0s - loss: 0.0020 - 219ms/epoch - 2ms/step
Epoch 97/100
94/94 - 0s - loss: 0.0020 - 214ms/epoch - 2ms/step
Epoch 98/100
94/94 - 0s - loss: 0.0020 - 215ms/epoch - 2ms/step
Epoch 99/100
94/94 - 0s - loss: 0.0020 - 213ms/epoch - 2ms/step
Epoch 100/100
94/94 - 0s - loss: 0.0020 - 221ms/epoch - 2ms/step
Out[61]:
<keras.callbacks.History at 0x7f663c5adb10>
In [62]:
# make predictions
trainPredict = model.predict(trainX)
testPredict = model.predict(testX)
3/3 [==============================] - 0s 4ms/step
2/2 [==============================] - 0s 5ms/step
In [63]:
# invert predictions
trainPredict = scaler.inverse_transform(trainPredict)
trainY = scaler.inverse_transform([trainY])
testPredict = scaler.inverse_transform(testPredict)
testY = scaler.inverse_transform([testY])
In [64]:
# calculate root mean squared error
trainScore = np.sqrt(mean_squared_error(trainY[0], trainPredict[:,0]))
print('Train Score: %.2f RMSE' % (trainScore))
testScore = np.sqrt(mean_squared_error(testY[0], testPredict[:,0]))
print('Test Score: %.2f RMSE' % (testScore))
Train Score: 22.68 RMSE
Test Score: 49.34 RMSE
In [65]:
# shift train predictions for plotting
trainPredictPlot = np.empty_like(dataset)
trainPredictPlot[:, :] = np.nan
trainPredictPlot[look_back:len(trainPredict)+look_back, :] = trainPredict
In [66]:
# shift test predictions for plotting
testPredictPlot = np.empty_like(dataset)
testPredictPlot[:, :] = np.nan
testPredictPlot[len(trainPredict)+(look_back*2)+1:len(dataset)-1, :] = testPredict
In [67]:
# plot baseline and predictions
plt.plot(scaler.inverse_transform(dataset))
plt.plot(trainPredictPlot)
plt.plot(testPredictPlot)
plt.show()

Questão 7¶

Apresente um estudo sobre transferência de conhecimento (transfer learning) no contexto de deep learning.

De acordo com Brownlee (2019), Transfer Learning é um método de aprendizagem de máquina no qual permite que um modelo treinado para uma determinada tarefa seja utilizado como um ponto de partida de um modelo para outro problema com características similares.

Esta técnica é bastante utilizada em tarefas de visão computacional, processamento de linguagem natural, dados os vastos recursos computacionais (principalmente as GPUs) e tempo necessários para desenvolver modelos de rede neural sobre esses problemas, além das emissões de CO2 causadas pelo gasto de energia. Além disso, é possível encontrar modelos treinados por grandes empresas por muitas horas e com uma grande infraestrutura disponíveis de forma gratuita na internet. Isso permite obter excelentes resultados com um tempo de treinamento muito menor do que seria treinar um novo modelo do zero.

É possível utilizar esta técnica a partir dos três passos a seguir:

  1. Selecione Modelo pré-treinado: Escolha um modelo de origem pré-treinado dentre os modelos disponíveis. Muitas instituições de pesquisa lançam modelos em conjuntos de dados grandes e desafiadores que podem ser incluídos no conjunto de modelos candidatos a serem escolhidos.

  2. Modelo de Reutilização: Utiliza o modelo pré-treinado como ponto de partida para um modelo na segunda tarefa de interesse. Isso pode envolver o uso de todo ou de partes do modelo, dependendo da técnica de modelagem utilizada.

  3. Modelo de sintonia: Opcionalmente, o modelo pode precisar ser adaptado ou refinado nos dados do par de entrada-saída disponíveis para a tarefa de interesse. Assim, você pode adicionar mais camadas ao final da rede para adptar ao seu problema.

Exemplos de modelos pré-treinados na área de visão computacional:

  • Oxford VGG Model
  • Google Inception Model
  • Microsoft ResNet Model

Exemplos de modelos pré-treinados na área de processamento de linguagem natural:

  • Google’s word2vec Model
  • Stanford’s GloVe Model

Referências:¶

  • Brownlee, Jason (2019). A Gentle Introduction to Transfer Learning for Deep Learning. Disponível em: https://machinelearningmastery.com/transfer-learning-for-deep-learning/